Joint Learning for Sentiment Classification Towards Question-Answering Reviews

AN Minghui1, SHEN Chenlin1, LI Shoushan1, LEE Sophia Yat Mei2

Journal of Chinese Information Processing, 2019, Vol. 33, Issue (10): 119-126
Section: Sentiment Analysis and Social Computing

Abstract

Sentiment classification towards question-answering (QA) reviews is a novel and challenging task in the sentiment analysis community. Because annotated data for QA sentiment classification are scarce, the performance of purely supervised approaches is limited. To overcome this problem, this paper proposes a joint learning approach that exploits a large volume of naturally labeled ordinary reviews, treating QA sentiment classification as the main task and ordinary-review sentiment classification as the auxiliary task. Specifically, the main task model first learns the sentiment information of QA reviews on its own; the auxiliary task model is then trained on both QA reviews and ordinary reviews to capture auxiliary sentiment information for QA reviews; finally, the parameters of the main and auxiliary task models are learned and updated simultaneously through joint learning. Experimental results show that the proposed approach effectively fuses the sentiment information of QA reviews and ordinary reviews and substantially improves the performance of QA sentiment classification over a number of state-of-the-art baselines.
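The training scheme the abstract describes — a main task and a down-weighted auxiliary task whose parameters are updated simultaneously — can be sketched in miniature. The code below is an illustrative toy, not the authors' neural models: it uses a shared weight vector in place of their encoders, plain logistic losses, and made-up data and hyperparameters (`lam` for the auxiliary weight is an assumed value).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: 3-dim feature vectors with binary sentiment labels.
# qa_data stands in for the scarce QA-review corpus (main task),
# plain_data for the abundant ordinary reviews (auxiliary task).
qa_data    = [([1.0, 0.2, 0.1], 1), ([0.1, 1.0, 0.9], 0)]
plain_data = [([0.9, 0.1, 0.2], 1), ([0.2, 0.9, 1.0], 0),
              ([1.1, 0.3, 0.0], 1), ([0.0, 0.8, 1.1], 0)]

dim = 3
shared = [0.0] * dim   # shared "encoder" weights, trained by both tasks
head_main = 0.0        # task-specific bias terms (hypothetical heads)
head_aux = 0.0

def grad_step(data, head, lr, lam):
    """One pass of gradient descent on one task's logistic loss.
    Updates the shared weights (scaled by the task weight lam)
    and returns the updated task-specific head."""
    global shared
    for x, y in data:
        p = sigmoid(sum(w * xi for w, xi in zip(shared, x)) + head)
        err = p - y
        shared = [w - lr * lam * err * xi for w, xi in zip(shared, x)]
        head -= lr * lam * err
    return head

# Joint learning: every epoch updates the parameters of BOTH tasks,
# with the auxiliary loss down-weighted (lam=0.5 is an assumption).
for epoch in range(200):
    head_main = grad_step(qa_data, head_main, lr=0.5, lam=1.0)
    head_aux  = grad_step(plain_data, head_aux, lr=0.5, lam=0.5)

def predict_qa(x):
    """Main-task prediction for a QA review."""
    return int(sigmoid(sum(w * xi for w, xi in zip(shared, x)) + head_main) > 0.5)

print([predict_qa(x) for x, _ in qa_data])
```

Because the ordinary reviews share the sentiment regularities of the QA reviews, their gradients shape the shared weights even though the main task alone has very little data — which is the intuition behind using naturally labeled ordinary reviews as the auxiliary signal.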

Key words

sentiment classification / question-answering text / joint learning

Cite this article

AN Minghui, SHEN Chenlin, LI Shoushan, LEE Sophia Yat Mei. Joint Learning for Sentiment Classification Towards Question-Answering Reviews. Journal of Chinese Information Processing. 2019, 33(10): 119-126


Funding

National Natural Science Foundation of China (61331011, 61375073)