Sentiment Classification Based on Lifelong Naive Bayesian

WANG Song, MAIRIDAN Wushouer, GULANBAIER Tuerhong, DUAN Shumin

Journal of Chinese Information Processing, 2023, Vol. 37, Issue (9): 150-160.
Sentiment Analysis and Social Computing


Sentiment Classification Based on Lifelong Naive Bayesian

  • WANG Song, MAIRIDAN Wushouer, GULANBAIER Tuerhong, DUAN Shumin

Abstract

Faced with rapidly growing and changing online information, existing sentiment classification models suffer from catastrophic forgetting and cannot accumulate or transfer knowledge once training is complete. Inspired by the human process of continual learning, lifelong learning (LL) attempts to address this problem. This paper discusses the concept of lifelong learning and builds a multi-task Chinese sentiment classification dataset for the LL paradigm. The lifelong naive Bayesian (LNB) method is applied to Chinese sentiment classification: reviews of different product categories are treated as separate tasks, and LNB learns them sequentially, building a classifier for each task. LNB is further improved with a domain attention mechanism, yielding the proposed method, LNB-DA. Experiments show that the method is capable of knowledge accumulation and transfer, outperforming baselines by 0.15%-23.41% in negative-class F1 score.
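The core idea behind lifelong naive Bayes can be illustrated with a minimal sketch: word counts from previously learned tasks are kept in a knowledge base and blended, with a down-weighting factor, into the count estimates of the current task. This is only an illustrative reconstruction under assumptions, not the paper's implementation; the class name `LifelongNB` and the parameters `alpha` and `past_weight` are hypothetical, class priors are assumed balanced, and the domain attention mechanism of LNB-DA is not shown.

```python
import math
from collections import Counter

class LifelongNB:
    """Illustrative sketch (hypothetical, not the paper's code): a naive
    Bayes classifier that accumulates per-class word counts from past
    tasks in a knowledge base and mixes them into the current task."""

    def __init__(self, alpha=1.0, past_weight=0.5):
        self.alpha = alpha              # Laplace smoothing constant
        self.past_weight = past_weight  # weight of accumulated past-task counts
        self.kb = {"pos": Counter(), "neg": Counter()}  # knowledge base
        self.vocab = set()

    def fit_task(self, docs, labels):
        """Learn one task (one product-review category), blending in
        past knowledge, then add this task's counts to the knowledge base."""
        cur = {"pos": Counter(), "neg": Counter()}
        for doc, y in zip(docs, labels):
            cur[y].update(doc)
            self.vocab.update(doc)
        # merged counts = current task + down-weighted past-task knowledge
        self.merged = {c: Counter({w: cur[c][w] + self.past_weight * self.kb[c][w]
                                   for w in self.vocab})
                       for c in ("pos", "neg")}
        self.totals = {c: sum(self.merged[c].values()) for c in ("pos", "neg")}
        # accumulate for future tasks
        for c in ("pos", "neg"):
            self.kb[c].update(cur[c])

    def predict(self, doc):
        """Log-space naive Bayes scoring; class priors assumed uniform."""
        V = len(self.vocab)
        scores = {}
        for c in ("pos", "neg"):
            scores[c] = sum(
                math.log((self.merged[c][w] + self.alpha)
                         / (self.totals[c] + self.alpha * V))
                for w in doc)
        return max(scores, key=scores.get)
```

After a second task is learned, a word seen only in the first task (e.g. "good") still carries positive evidence through the knowledge base, which is the knowledge-accumulation behavior the abstract describes.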

Key words

lifelong learning / continual learning / naive Bayesian / sentiment classification

Cite this article

WANG Song, MAIRIDAN Wushouer, GULANBAIER Tuerhong, DUAN Shumin. Sentiment Classification Based on Lifelong Naive Bayesian. Journal of Chinese Information Processing, 2023, 37(9): 150-160.


Funding

Natural Science Foundation of the Autonomous Region (2018D01C075); Young Scholars Research Project of the Education Department (61021800032, 61021211418); High-level Innovative Talents Project of the Autonomous Region (100400016, 042419006)