ISSN 1000-1239 CN 11-1777/TP

Journal of Computer Research and Development ›› 2019, Vol. 56 ›› Issue (8): 1731-1745. doi: 10.7544/issn1000-1239.2019.20190102

Special Issue: 2019 Advances in Frontiers of Artificial Intelligence


Aspect-Level Sentiment Classification for Sentences Based on Dependency Tree and Distance Attention

Su Jindian1, Ouyang Zhifan1, Yu Shanshan2   

  1(College of Computer Science and Engineering, South China University of Technology, Guangzhou 510640); 2(College of Medical Information Engineering, Guangdong Pharmaceutical University, Guangzhou 510006)
  • Online: 2019-08-01

Abstract: Current attention-based approaches to aspect-level sentiment classification usually neglect the contexts of aspects and the distance features between words and aspects, which makes it difficult for the attention mechanism to learn suitable attention weights. To address this problem, a dependency tree and distance attention-based model, DTDA, is proposed for aspect-level sentiment classification. First, DTDA uses the sentence's dependency tree to extract the dependency subtree (aspect sub-sentence) that contains the modification information of the aspect, and then uses bidirectional GRU networks to learn the contexts of the sentence and aspects. Next, position weights are determined according to the syntactic distance between each word and the aspect along their path on the dependency tree, and are further combined with relative distance to build sentence representations that contain both semantic and distance information. The aspect-related sentiment feature representations are finally generated via the attention mechanism, merged with the sentence-level context, and fed to a softmax layer for classification. Experimental results show that DTDA achieves results comparable to current state-of-the-art methods on the two SemEval 2014 benchmark datasets, Laptop and Restaurant. When using word vectors pre-trained on domain-related data, DTDA reaches a precision of 77.01% on Laptop and 81.68% on Restaurant.
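The abstract does not give the paper's exact weighting formula, but the core idea of the distance attention step, measuring how many dependency arcs separate each word from the aspect and turning that into a position weight, can be sketched as follows. The function names and the linear weighting scheme here are illustrative assumptions, not the authors' implementation:

```python
from collections import deque

def syntactic_distances(heads, aspect_idx):
    """Number of dependency arcs between each token and the aspect token,
    treating the dependency tree as an undirected graph.
    heads[i] is the index of token i's head (-1 for the root)."""
    n = len(heads)
    adj = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].append(h)
            adj[h].append(i)
    dist = [None] * n
    dist[aspect_idx] = 0
    queue = deque([aspect_idx])
    while queue:  # breadth-first search from the aspect token
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] is None:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def position_weights(dists, n):
    """Illustrative linear decay: words syntactically closer to the
    aspect receive weights nearer 1."""
    return [1.0 - d / n for d in dists]

# Toy sentence "the food was great" with aspect "food" (index 1);
# heads: the->food, food->was, was=root, great->was.
d = syntactic_distances([1, 2, -1, 2], 1)
w = position_weights(d, 4)
```

Here the opinion word "great" is two arcs from "food" on the tree even though it is three tokens away linearly, which is exactly the kind of signal the paper combines with relative (linear) distance.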

Key words: deep learning, aspect-level sentiment classification, attention, dependency tree, natural language processing
