ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2020, Vol. 57 ›› Issue (12): 2583-2595. doi: 10.7544/issn1000-1239.2020.20190854

• Artificial Intelligence •


Text Sentiment Orientation Analysis of Multi-Channel CNN and BiGRU Based on Attention Mechanism

Cheng Yan1, Yao Leibo1, Zhang Guanghe1, Tang Tianwei2, Xiang Guoxiong3, Chen Haomai4, Feng Yue1, Cai Zhuang1   

  1(School of Computer Information Engineering, Jiangxi Normal University, Nanchang 330022); 2(Center of Management Decision Evaluation Research, Jiangxi Normal University, Nanchang 330022); 3(School of Journalism and Communication, Jiangxi Normal University, Nanchang 330022); 4(School of Mathematics and Computer, Yuzhang Normal University, Nanchang 330022) (chyan88888@jxnu.edu.cn)
  • Online: 2020-12-01
  • Supported by: 
    This work was supported by the National Natural Science Foundation of China (61967011), the Natural Science Foundation Project of Jiangxi Province (20202BABL202033), the Primary Research and Development Program of Jiangxi Province (20161BBE50086), the Science and Technology Key Project of Education Department of Jiangxi Province (GJJ150299), and the Humanities and Social Sciences Key (Major) Project of the Education Department (JD19056).


Abstract: In recent years, CNN (convolutional neural network) and RNN (recurrent neural network) have been widely used in text sentiment analysis and have achieved good results. However, texts exhibit contextual dependencies: although a CNN can extract local features from consecutive words in a sentence, it ignores the contextual semantic information between words. A BiGRU (bidirectional gated recurrent unit) network can not only avoid the vanishing- or exploding-gradient problem of traditional RNN models but also compensate for the inability of a CNN to effectively extract the contextual semantic information of long texts; however, it cannot extract local sentence features as well as a CNN. Therefore, this paper proposes MC-AttCNN-AttBiGRU, a multi-channel CNN and BiGRU network based on an attention mechanism. Through attention, the model focuses on the words in a sentence that matter most for sentiment polarity classification, and it combines the strength of the CNN in extracting local text features with that of the BiGRU in extracting the contextual semantic information of long texts, improving the model's ability to extract text features. Experimental results on the Tan Songbo Hotel Review dataset and the IMDB dataset show that the proposed model extracts richer text features and achieves better classification results than several baseline models.
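The abstract describes the architecture only at a high level. The sketch below is a minimal, hypothetical PyTorch rendering of such a design, assuming parallel convolutional channels and a BiGRU branch that are each followed by an additive attention layer and fused by concatenation before a classifier; the kernel sizes, dimensions, attention formulation, and fusion strategy are illustrative assumptions, not details taken from the paper.

# Minimal sketch (not the authors' released code) of an attention-based
# multi-channel CNN + BiGRU sentiment classifier, as loosely described above.
# All hyperparameters and the fusion-by-concatenation choice are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdditiveAttention(nn.Module):
    """Scores each time step and returns the attention-weighted sum."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.score = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                      # h: (batch, seq_len, dim)
        weights = torch.softmax(self.score(torch.tanh(self.proj(h))), dim=1)
        return (weights * h).sum(dim=1)        # (batch, dim)


class MCAttCNNAttBiGRU(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_filters=100,
                 kernel_sizes=(3, 4, 5), gru_hidden=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Multi-channel CNN: one convolution per (assumed) kernel size.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2) for k in kernel_sizes)
        self.cnn_attn = nn.ModuleList(AdditiveAttention(num_filters) for _ in kernel_sizes)
        # BiGRU branch for long-range contextual information.
        self.bigru = nn.GRU(embed_dim, gru_hidden, batch_first=True, bidirectional=True)
        self.gru_attn = AdditiveAttention(2 * gru_hidden)
        self.classifier = nn.Linear(len(kernel_sizes) * num_filters + 2 * gru_hidden,
                                    num_classes)

    def forward(self, token_ids):              # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        # CNN channels: Conv1d expects (batch, embed_dim, seq_len).
        cnn_feats = []
        for conv, attn in zip(self.convs, self.cnn_attn):
            c = F.relu(conv(x.transpose(1, 2))).transpose(1, 2)  # (batch, seq', filters)
            cnn_feats.append(attn(c))          # attention pools each channel
        gru_out, _ = self.bigru(x)             # (batch, seq_len, 2*gru_hidden)
        gru_feat = self.gru_attn(gru_out)
        features = torch.cat(cnn_feats + [gru_feat], dim=-1)
        return self.classifier(features)


# Usage: logits for a toy batch of padded token-id sequences.
model = MCAttCNNAttBiGRU(vocab_size=20000)
logits = model(torch.randint(1, 20000, (8, 60)))   # shape (8, 2)

Because each attention layer pools its branch into a fixed-size vector, the concatenated feature has the same size for any input length, which keeps the classifier head simple in this sketch.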

Key words: CNN (convolutional neural network), text sentiment orientation analysis, BiGRU (bidirectional gated recurrent unit), attention mechanism, multi-channel

CLC number: