Interpretable Sentiment Analysis Based on UIE

ZHU Jie, LIU Suwen, LI Junhui, GUO Lifan, ZENG Haifeng, CHEN Feng

Journal of Chinese Information Processing, 2023, Vol. 37, No. 11: 151-157.
Sentiment Analysis and Social Computing


Abstract

Interpretable sentiment analysis is a new task in the field of sentiment analysis: a model must judge the polarity of a text and, at the same time, provide the evidence on which its judgement is based. Most existing sentiment analysis methods are black-box models whose internal decision mechanisms are opaque to users. Although model interpretability has received increasing attention in recent years, interpretability evaluation remains an open problem due to the lack of manually annotated evaluation data. This paper proposes an interpretable sentiment analysis method based on UIE (Universal Information Extraction). Guided by the characteristics of the interpretable sentiment task, the method uses few-shot learning, text clustering, and related techniques to improve the rationality and faithfulness of the model. Experimental results show that the method won first place in the 2022 Language and Intelligence Technology Competition: Interpretable Sentiment Evaluation task.
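The abstract only names the building blocks. As a rough illustration of how a UIE-style model can return both a polarity label and the evidence spans behind it, the sketch below uses PaddleNLP's Taskflow UIE interface with an aspect/opinion/polarity extraction schema. The schema wording, the example sentence, and the mapping onto the authors' pipeline are assumptions made for illustration, not the paper's actual configuration.

# A minimal sketch, assuming PaddleNLP's Taskflow UIE interface is available
# (pip install paddlepaddle paddlenlp). Frames interpretable sentiment
# analysis as structured extraction: for each aspect, extract the opinion
# words (usable as evidence) and a polarity label, all with character offsets.
from pprint import pprint
from paddlenlp import Taskflow

# Illustrative schema: aspect -> [opinion words, sentiment polarity (positive/negative)]
schema = {"评价维度": ["观点词", "情感倾向[正向，负向]"]}
ie = Taskflow("information_extraction", schema=schema)

text = "店面干净，环境很好，服务员态度也不错，就是价格偏高。"
pprint(ie(text))
# Each extracted aspect is returned with its text span, offsets, probability,
# and nested relations holding the opinion-word spans and the polarity label;
# the opinion spans are the kind of evidence an interpretability evaluation scores.

Few-shot fine-tuning of UIE on task-specific examples and clustering of the extracted evidence spans, both mentioned in the abstract, would be layered on top of this basic extraction step.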

Key words

interpretable sentiment analysis / UIE / few-shot learning

Cite this article

ZHU Jie, LIU Suwen, LI Junhui, GUO Lifan, ZENG Haifeng, CHEN Feng. Interpretable Sentiment Analysis Based on UIE. Journal of Chinese Information Processing. 2023, 37(11): 151-157


Funding

National Natural Science Foundation of China (61876120)