ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2019, Vol. 56 ›› Issue (11): 2384-2395.doi: 10.7544/issn1000-1239.2019.20180823


Aspect-Based Sentiment Analysis Model Based on Dual-Attention Networks

Sun Xiaowan1,3, Wang Ying1,2,3, Wang Xin3,4, Sun Yudong2,3   

  1. College of Software, Jilin University, Changchun 130012; 2. College of Computer Science and Technology, Jilin University, Changchun 130012; 3. Key Laboratory of Symbol Computation and Knowledge Engineering (Jilin University), Ministry of Education, Changchun 130012; 4. College of Computer Technology and Engineering, Changchun Institute of Technology, Changchun 130012
  • Online: 2019-11-12

Abstract: Aspect-based sentiment analysis has become one of the most active research topics in natural language processing. It identifies the sentiment polarity of a text toward specific aspects by learning from context information, which can effectively help people understand the sentiment expressed on different aspects. Currently, most models that combine an attention mechanism with a neural network consider only a single level of attention information, which limits them on aspect-based sentiment analysis tasks: convolutional neural networks cannot capture global structural information, while recurrent neural networks are slow to train and the modeled dependence between words weakens as the distance between them grows. To address these problems, we propose the dual-attention networks for aspect-level sentiment analysis (DANSA) model. First, by introducing the multi-head attention mechanism, the model applies multiple linear transformations to the input to obtain more comprehensive attention information, which enables parallel computation and speeds up training. Second, the self-attention mechanism is introduced to obtain global structural information by computing attention scores between each word and all other words in the input, so that the degree of dependence between words is not affected by time steps or sentence length. Finally, the model predicts aspect sentiment polarity by combining the contextual self-attention information with the aspect-word attention information. Extensive experiments on the SemEval 2014 datasets and a Twitter dataset show that DANSA achieves better classification performance, which further demonstrates its effectiveness.
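The two mechanisms the abstract describes can be illustrated with a minimal NumPy sketch: scaled dot-product self-attention lets every word attend to every other word regardless of distance, and multi-head attention applies several independent linear transformations to the input in parallel. All function names, dimensions, and random weights below are illustrative assumptions, not the paper's actual DANSA implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(Q, K, V):
    # Scaled dot-product attention: attention scores are computed between
    # each word and all other words, so the dependence between two words
    # does not decay with their distance, unlike in an RNN.
    d_k = Q.shape[-1]
    scores = softmax(Q @ K.swapaxes(-2, -1) / np.sqrt(d_k))
    return scores @ V

def multi_head_attention(X, num_heads, rng):
    # Multiple linear transformations of the same input (one Q/K/V
    # projection per head), attended independently and concatenated.
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) for _ in range(3))
        heads.append(self_attention(X @ Wq, X @ Wk, X @ Wv))
    return np.concatenate(heads, axis=-1)  # shape: (seq_len, d_model)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))  # toy input: 5 words, 8-dim embeddings
out = multi_head_attention(X, num_heads=2, rng=rng)
print(out.shape)  # (5, 8)
```

Because each head's attention is a matrix product over the whole sequence, the heads can be computed in parallel, which is the training-speed advantage the abstract contrasts with sequential RNN processing.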

Key words: aspect-based sentiment analysis (ABSA), self-attention, multi-head attention, dual-attention networks, natural language processing (NLP)
