Cross-Domain Text Generation Method Based on Semantic Conduction of Intermediate Domains
Graphical Abstract
Abstract
Deep neural networks are widely used in natural language processing. In text generation tasks with multi-domain data, the data distributions of different domains often differ, and the introduction of a new domain typically also brings a shortage of data. Supervised methods require a large amount of ground-truth data in the task domain to train a deep neural network text generation model, and the trained model generalizes poorly to new domains. To address the problems of distribution differences and data deficiency in multi-domain tasks, a comprehensive transfer text generation method, inspired by transfer learning, is designed to reduce the distribution differences between text data from different domains while exploiting the semantic correlation between source-domain and target-domain text to help deep neural network text generation models generalize to new domains. Experiments on a publicly available dataset verify the effectiveness of the proposed method for domain transfer, and the transferred deep neural network text generation model produces better text in new domains. The proposed method also outperforms existing transfer text generation methods on all text generation evaluation metrics.
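To make the general idea concrete, the sketch below is a minimal, hypothetical illustration (not the paper's actual method or architecture) of how a transfer-style text generation objective can be assembled: a supervised generation loss on labeled source-domain data is combined with a distribution-discrepancy penalty (here an RBF-kernel MMD, chosen only for illustration) between source- and target-domain sentence representations. The toy model, the MMD choice, and the trade-off weight `lambda_mmd` are all assumptions introduced for this example.

```python
# Hypothetical sketch: source-domain generation loss + domain-discrepancy penalty.
# The model, losses, and hyperparameters are illustrative assumptions, not the paper's method.
import torch
import torch.nn as nn

def rbf_mmd(x, y, sigma=1.0):
    """Maximum Mean Discrepancy with an RBF kernel between two batches of features."""
    def kernel(a, b):
        dist = torch.cdist(a, b) ** 2
        return torch.exp(-dist / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

class TinySeq2Seq(nn.Module):
    """Toy encoder-decoder, only to keep the sketch self-contained and runnable."""
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def encode(self, tokens):
        _, h = self.encoder(self.embed(tokens))
        return h.squeeze(0)                      # (batch, dim) sentence representation

    def forward(self, src_tokens, ref_tokens):
        h = self.encode(src_tokens).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(ref_tokens), h)
        return self.out(dec_out)                 # (batch, seq, vocab) logits

model = TinySeq2Seq()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()
lambda_mmd = 0.1                                 # assumed trade-off weight

# Dummy batches standing in for labeled source-domain data and unlabeled target-domain data.
src_in = torch.randint(0, 1000, (8, 20))
src_ref = torch.randint(0, 1000, (8, 20))
tgt_in = torch.randint(0, 1000, (8, 20))

logits = model(src_in, src_ref)                  # supervised generation loss (source domain only)
gen_loss = ce(logits.reshape(-1, logits.size(-1)), src_ref.reshape(-1))
domain_loss = rbf_mmd(model.encode(src_in), model.encode(tgt_in))  # align domain representations
(gen_loss + lambda_mmd * domain_loss).backward()
optimizer.step()
```

The design intent captured here is only the high-level one stated in the abstract: the generation objective is learned where ground truth exists, while the discrepancy term pushes the encoder to represent both domains in a shared space so that generation can transfer to the new domain.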