ISSN 1000-1239 CN 11-1777/TP

• Artificial Intelligence •

### Aspect-Based Sentiment Analysis Based on Multi-Attention Convolutional Neural Networks

1 (College of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215000); 2 (Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing 210000); 3 (Key Laboratory of Symbolic Computation and Knowledge Engineering (Jilin University), Ministry of Education, Changchun 130012) (bliang@stu.suda.edu.cn)
• Publication date: 2017-08-01
• Funding:
National Natural Science Foundation of China (61272005, 61303108, 61373094, 61472262, 61502323, 61502329); Natural Science Foundation of Jiangsu Province (BK2012616); Natural Science Research Project of Jiangsu Higher Education Institutions (13KJB520020); Fund of the Key Laboratory of Symbolic Computation and Knowledge Engineering (Jilin University), Ministry of Education (93K172014K04)

### Aspect-Based Sentiment Analysis Based on Multi-Attention CNN

Liang Bin¹, Liu Quan¹﹐²﹐³, Xu Jin¹, Zhou Qian¹, Zhang Peng¹

1 (College of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215000); 2 (Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing 210000); 3 (Key Laboratory of Symbolic Computation and Knowledge Engineering (Jilin University), Ministry of Education, Changchun 130012)
• Online: 2017-08-01

Abstract: Unlike general sentiment analysis, aspect-based sentiment classification aims to infer the sentiment polarity of a sentence depending not only on the context but also on the aspect. For example, in the sentence "The food was very good, but the service at that restaurant was dreadful", the sentiment polarity for the aspect "food" is positive, while the polarity for the aspect "service" is negative. Even within the same sentence, the sentiment polarity can be completely opposite when focusing on different aspects, so the polarities of different aspects must be inferred correctly. The attention mechanism is well suited to aspect-based sentiment classification. In current research, however, attention is mostly combined with RNN or LSTM networks. Such neural architectures generally rely on complex structures and cannot be parallelized over the words of a sentence. To address these problems, this paper proposes a multi-attention convolutional neural network (MATT-CNN) for aspect-based sentiment classification. The approach captures deeper-level sentiment information and explicitly distinguishes the sentiment polarities of different aspects through a multi-attention mechanism, without using any external parsing results. Experiments on the SemEval-2014 and automotive-domain datasets show that our approach achieves better performance than traditional CNN, attention-based CNN, and attention-based LSTM models.
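Since the abstract only sketches the architecture, the core idea — scoring each word against the aspect, normalizing the scores into attention weights, re-scaling the word vectors, then applying a convolution-and-pooling step — can be illustrated with a minimal toy sketch. Everything here (the dot-product scoring, the window-sum "convolution", all function names) is an illustrative assumption, not the paper's actual MATT-CNN:

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def aspect_attention(word_vecs, aspect_vec):
    # Score each word vector against the aspect vector (here: dot product)
    # and normalize the scores into attention weights that sum to 1.
    scores = [dot(w, aspect_vec) for w in word_vecs]
    return softmax(scores)

def attend(word_vecs, weights):
    # Re-scale each word vector by its attention weight, so words relevant
    # to the aspect dominate the convolution input.
    return [[weight * x for x in w] for w, weight in zip(word_vecs, weights)]

def conv_max_pool(vecs, window=2):
    # Toy 1-D "convolution" (a sliding window sum over the weighted vectors)
    # followed by max-over-time pooling into a single feature.
    feats = []
    for i in range(len(vecs) - window + 1):
        feats.append(sum(x for v in vecs[i:i + window] for x in v))
    return max(feats)

# Toy example: three 2-d word vectors and one aspect vector.
word_vecs = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
aspect = [1.0, 0.0]
weights = aspect_attention(word_vecs, aspect)
feature = conv_max_pool(attend(word_vecs, weights))
```

Note that, unlike a recurrent encoder, each attention weight and each convolution window here depends only on fixed inputs, which is what makes the computation parallelizable over the words of a sentence — the property the abstract highlights over RNN/LSTM-based attention models.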