Multi-Granular User Preferences for Document-Level Sentiment Analysis

CHEN Jie, WANG Siyu, ZHAO Shu, ZHANG Yanping, YU Jingying

Journal of Chinese Information Processing ›› 2023, Vol. 37 ›› Issue (7): 122-130.
Sentiment Analysis and Social Computing


Abstract

Different users typically exhibit multi-granular preferences: they use different wording habits to express sentiment (word-level user preference), express different sentiment intensities across sentences (sentence-level user preference), and apply different rating criteria when evaluating products (document-level user preference). Existing sentiment models do not account for this multi-granularity of user preferences when building text representations. Accordingly, this paper proposes a sentiment analysis model that fuses multi-granular user preferences. First, at the word granularity, user information is incorporated into the attention mechanism to obtain user-preference-aware sentence representations. Then, at the sentence granularity, a self-attention mechanism is applied to obtain a user-preference-aware document representation. Finally, at the document granularity, that document representation is fused directly with the user information to produce the final document feature, which is used for classification. Experiments on three document-level datasets, IMDB, Yelp13, and Yelp14, show that the model effectively improves classification performance.
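The page carries only the abstract, not the model equations, so the word-granularity step above can only be illustrated, not reproduced. The sketch below assumes a common additive-attention formulation in which a user embedding biases the attention scores (score_i = v · tanh(h_i·Wh + u·Wu)), so that two users reading the same words can weight them differently. All names, shapes, and weights are illustrative assumptions, not the authors' code:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a plain list.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def mat_vec(x, W):
    # x @ W for a vector x (len d) and matrix W (d rows, da cols).
    return [dot(x, col) for col in zip(*W)]

def user_attention_pool(H, u, Wh, Wu, v):
    """Pool word vectors H = [h_1..h_n] into one sentence vector.

    Additive attention conditioned on the user embedding u:
        score_i = v . tanh(h_i @ Wh + u @ Wu)
    """
    proj_u = mat_vec(u, Wu)
    scores = [dot(v, [math.tanh(ph + pu)
                      for ph, pu in zip(mat_vec(h, Wh), proj_u)])
              for h in H]
    alpha = softmax(scores)                      # attention weights, sum to 1
    sent = [sum(a * h[j] for a, h in zip(alpha, H))
            for j in range(len(H[0]))]           # weighted sum of word vectors
    return sent, alpha

# Tiny toy example: two 2-dim word vectors, a 1-dim user embedding.
H = [[1.0, 0.0], [0.0, 1.0]]
u = [1.0]
Wh = [[1.0, 0.0], [0.0, 1.0]]   # 2x2 word projection
Wu = [[0.0, 0.0]]               # 1x2 user projection (this toy user adds no bias)
v = [1.0, 0.0]
sent, alpha = user_attention_pool(H, u, Wh, Wu, v)
```

The same pooling idea would then repeat one level up: the sentence vectors feed a sentence-granularity self-attention to form the document vector, which is finally concatenated or fused with the user embedding before the classifier.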

Keywords

sentiment classification / document-level reviews / user preferences / multi-granularity attention network

Cite This Article
CHEN Jie, WANG Siyu, ZHAO Shu, ZHANG Yanping, YU Jingying. Multi-Granular User Preferences for Document-Level Sentiment Analysis. Journal of Chinese Information Processing. 2023, 37(7): 122-130


Funding

National Social Science Foundation of China (18ZDA032)