Sentiment Analysis Based on Comparing to Learn

LIN Yuan1, LI Jiaping2, YANG Liang2, ZHAO Xinhang1, QIN Xue3, XU Kan2, LIN Hongfei2

Journal of Chinese Information Processing, 2023, Vol. 37, Issue 2: 129-137.
Section: Sentiment Analysis and Social Computing


Abstract

Sentiment analysis, which classifies the sentiment tendency of a text based on extracted textual features, is an important task in natural language processing. To improve the performance of text sentiment analysis, this paper formulates the classification task as a comparison problem and proposes a classification model based on Comparing to Learn (C2L). C2L scores a sentence by comparing it with labeled samples; in practice, classification by comparison is more effective than training an overly complex model. Experiments on two commonly used datasets show that C2L outperforms many existing models, and that adjusting its internal structure yields good performance across different tasks and datasets.
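The abstract does not specify the internals of the C2L model, but the general idea it names, scoring a new sentence by comparison with labeled samples rather than by a directly trained classifier, can be illustrated with a minimal sketch. The bag-of-words representation, cosine scoring, and nearest-exemplar decision rule below are illustrative assumptions, not the authors' architecture:

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words vector as a token -> count mapping."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify_by_comparison(sentence, labeled_samples):
    """Score the sentence against each labeled sample and return the
    label of the best-scoring comparison (a nearest-exemplar rule)."""
    v = bow(sentence)
    best_label, best_score = None, -1.0
    for text, label in labeled_samples:
        score = cosine(v, bow(text))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

samples = [
    ("the movie was wonderful and moving", "positive"),
    ("a dull plot and terrible acting", "negative"),
]
print(classify_by_comparison("wonderful and moving story", samples))  # -> positive
```

In the paper's setting the hand-crafted similarity would presumably be replaced by a learned scoring function over neural sentence representations; the sketch only shows why comparison against labeled anchors can stand in for a conventional classifier head.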

Key words

sentiment analysis / deep learning / comparing to learn

Cite this article

LIN Yuan, LI Jiaping, YANG Liang, ZHAO Xinhang, QIN Xue, XU Kan, LIN Hongfei. Sentiment Analysis Based on Comparing to Learn. Journal of Chinese Information Processing, 2023, 37(2): 129-137.


Funding

National Natural Science Foundation of China (61976036, 61772103); Fundamental Research Funds for the Central Universities (DUT22RW110)