A Weighted Dependency Tree Convolutional Networks for Aspect-Based Sentiment Analysis

YANG Chunxia1,2,3, SONG Jinjian1,2,3, YAO Sicheng1,2,3

Journal of Chinese Information Processing ›› 2022, Vol. 36 ›› Issue (5): 125-132.
Sentiment Analysis and Social Computing

Abstract

For aspect-based sentiment analysis, existing rule-based dependency tree pruning methods suffer from deleting part of the useful information. In addition, how to use graph convolutional networks to capture the rich global information in a graph structure is another important open problem. For the first problem, this paper uses a multi-head attention mechanism to automatically learn to selectively attend to the structural information that is useful for the classification task, transforming the original dependency tree into a fully connected edge-weighted graph. For the second problem, it introduces dense connections into the graph convolutional network so that the network can capture rich local and global information. Experimental results on three public datasets show that the proposed model improves both accuracy and F1 score over the baseline models.
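To make the two ideas in the abstract more concrete, below is a minimal PyTorch sketch (not the authors' released code): a multi-head attention module whose attention scores are reused as the edge weights of a fully connected graph over the tokens, and a densely connected GCN stack in which each layer aggregates over that weighted graph and receives the concatenation of all earlier layer outputs. All class names, dimensions, and the averaging of heads into a single graph are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, assuming PyTorch; illustrative only, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionAdjacency(nn.Module):
    """Multi-head attention scores reinterpreted as edge weights of a fully connected graph."""

    def __init__(self, hidden_dim: int, num_heads: int = 4):
        super().__init__()
        assert hidden_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_dim // num_heads
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) token representations, e.g. from a BiLSTM encoder.
        b, n, _ = h.size()
        q = self.query(h).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.key(h).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product scores with a softmax over the last axis:
        # each row becomes the weights of the edges leaving one token.
        scores = torch.matmul(q, k.transpose(-2, -1)) / (self.head_dim ** 0.5)
        return F.softmax(scores, dim=-1)  # (batch, heads, seq_len, seq_len)


class DenseGCN(nn.Module):
    """GCN stack with DenseNet-style connections: each layer sees all previous outputs."""

    def __init__(self, input_dim: int, growth_dim: int, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(input_dim + i * growth_dim, growth_dim) for i in range(num_layers)]
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_dim); adj: (batch, seq_len, seq_len) edge weights.
        outputs = [x]
        for layer in self.layers:
            cat = torch.cat(outputs, dim=-1)   # dense connection to all earlier layers
            agg = torch.matmul(adj, cat)       # weighted aggregation over the soft graph
            outputs.append(F.relu(layer(agg)))
        return torch.cat(outputs, dim=-1)      # local and global features concatenated


if __name__ == "__main__":
    tokens = torch.randn(2, 10, 128)                    # toy token representations
    adj = AttentionAdjacency(128, num_heads=4)(tokens)  # (2, 4, 10, 10)
    adj = adj.mean(dim=1)                               # collapse heads into one graph (an assumption)
    out = DenseGCN(128, growth_dim=64, num_layers=3)(tokens, adj)
    print(out.shape)                                    # torch.Size([2, 10, 320])
```

Averaging the heads into a single graph is only one possible choice; how the paper actually combines head-specific graphs, handles the aspect term, and pools the final representation will differ from this sketch.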

Key words

dependency tree / multi-head attention / dense connections / graph convolutional networks

Cite this article

YANG Chunxia, SONG Jinjian, YAO Sicheng. A Weighted Dependency Tree Convolutional Networks for Aspect-Based Sentiment Analysis. Journal of Chinese Information Processing, 2022, 36(5): 125-132.

Funding

National Natural Science Foundation of China (51705260, 61273229); the National Social Science Major Project Cultivation Program of Nanjing University of Information Science and Technology