Abstract: Aspect-level sentiment classification aims to identify the sentiment polarity of each aspect mentioned in a sentence. To effectively model the dependencies among multiple aspects in a sentence, this paper proposes a graph convolutional network (GCN) approach. First, each aspect is encoded with its context via an attention mechanism. Then, a multi-aspect dependency graph is constructed from the syntactic dependency tree, and a GCN is applied on this graph to model the dependencies among the aspects in the sentence. Finally, sentiment classification is performed using the aspect representations generated by the GCN. Experiments on the Restaurant and Laptop datasets of SemEval-2014 Task 4 show that the proposed model achieves a significant improvement over standard GCN models.
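The core propagation step described above can be sketched as follows. This is a minimal, illustrative implementation of a single GCN layer over a dependency-based adjacency matrix, not the paper's actual model: the function name `gcn_layer`, the mean-aggregation with self-loops, and the toy dimensions are all assumptions made for clarity.

```python
def gcn_layer(H, A, W):
    """One GCN layer over a dependency graph (illustrative sketch):
    h_i' = ReLU( mean over j in N(i) plus i of H_j ) dot W,
    where A is the adjacency matrix derived from the dependency tree."""
    n = len(A)                    # number of tokens (graph nodes)
    dim_in = len(H[0])            # input feature size
    dim_out = len(W[0])           # output feature size
    out = []
    for i in range(n):
        # neighbors in the dependency graph, plus a self-loop
        neigh = [j for j in range(n) if A[i][j]] + [i]
        # mean-aggregate neighbor features
        agg = [sum(H[j][k] for j in neigh) / len(neigh) for k in range(dim_in)]
        # linear transform followed by ReLU
        row = [max(0.0, sum(agg[k] * W[k][m] for k in range(dim_in)))
               for m in range(dim_out)]
        out.append(row)
    return out

# Toy example: 3 tokens, 2-dimensional features (all values hypothetical).
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]                       # dependency edges between tokens
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # token representations
W = [[0.5, -0.2], [0.3, 0.7]]              # layer weights
H1 = gcn_layer(H, A, W)
```

Stacking such layers lets each aspect's representation absorb information from syntactically related aspects before classification.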