Attention Enhanced Bi-directional LSTM for Sentiment Analysis
GUAN Pengfei1, LI Bao'an1, LV Xueqiang1, ZHOU Jianshe2
1.Beijing Key Laboratory of Internet Culture & Digital Dissemination Research, Beijing Information Science & Technology University, Beijing 100101, China; 2.Beijing Advanced Innovation Center for Imaging Technology, Capital Normal University, Beijing 100048, China
Abstract: To address sentiment analysis at the sentence level, this paper proposes an attention-enhanced Bi-directional LSTM. The model employs an attention mechanism to learn, directly from the word vectors, a weight distribution over the words of a sentence that reflects each word's contribution to sentiment tendency. Evaluated on the NLPCC 2014 sentiment analysis dataset, the proposed model outperforms the other sentence-level sentiment classification models compared.
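A minimal sketch of such an architecture in PyTorch is shown below. It is an illustration of the general idea only: embed the words, run a Bi-directional LSTM, score each time step with a learned attention layer, and classify the attention-weighted sentence representation. The class name, layer sizes, and two-class output are assumptions for the example, not the authors' exact configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionBiLSTM(nn.Module):
    """Bi-directional LSTM with word-level attention for
    sentence-level sentiment classification (illustrative sketch)."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Attention layer: one score per time step's hidden state.
        self.attn = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)          # (B, T, E)
        hidden, _ = self.bilstm(embedded)             # (B, T, 2H)
        scores = self.attn(hidden).squeeze(-1)        # (B, T)
        weights = F.softmax(scores, dim=-1)           # attention weights over words
        # Weighted sum of hidden states -> sentence representation.
        context = torch.bmm(weights.unsqueeze(1), hidden).squeeze(1)  # (B, 2H)
        return self.classifier(context)               # sentiment logits

# Toy usage: a batch of two sentences, each five word ids long.
model = AttentionBiLSTM(vocab_size=10000)
logits = model(torch.randint(1, 10000, (2, 5)))
print(logits.shape)  # torch.Size([2, 2])

In this sketch the softmax over per-word scores plays the role of the learned weight distribution described in the abstract; the attention weights can also be inspected to see which words the model regards as most sentiment-bearing.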