RAVA: A Reinforced-Association-Based Method for 12345 Hotline Event Allocation

CHENG Xiaomin, CHEN Gang, CHEN Jianpeng, SHE Xiangrong, MAO Jian
Journal of Chinese Information Processing ›› 2022, Vol. 36 ›› Issue (10): 155-166, 172.
Sentiment Analysis and Social Computing


Abstract

In recent years, intelligent 12345 hotline services have become a typical representative of the digital transformation of local government. To address the long turnaround and low efficiency of manual event allocation, this paper proposes a reinforced-association-based allocation method for 12345 hotline events (RAVA). First, the RAVA model builds an event portrait via a pointer-generator network and obtains the event encoding. Second, an association-reinforced attention (ARA) mechanism captures the correlation between the event encoding and the concatenated "three-responsibility" vectors, producing an event-responsibility matching result. The matching result is then concatenated with the department description encoding and fed into a classifier. Finally, after answer re-ranking, the method outputs the department to which the 12345 hotline event should be dispatched. Experiments on the Wuhu 12345 hotline dataset show that RAVA outperforms baseline allocation methods on multiple metrics.
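The four-stage pipeline summarized in the abstract (event encoding, attention-based matching against department responsibilities, classification with department descriptions, and answer re-ranking) can be illustrated with a minimal toy sketch. This is not the paper's model: random vectors and dot-product scoring stand in for the trained pointer-generator, ARA attention, and classifier components, and all department names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: fixed-size random vectors replace the
# pointer-generator event encoding, each department's "three-responsibility"
# encoding, and each department's description encoding.
d = 16
event_vec = rng.normal(size=d)
departments = ["urban_mgmt", "transport", "environment"]
resp_vecs = {k: rng.normal(size=d) for k in departments}
desc_vecs = {k: rng.normal(size=d) for k in departments}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Stage 2 (sketch): score how well the event encoding matches each
# department's responsibility vector; a scaled dot product stands in
# for the ARA attention mechanism.
match = np.array([event_vec @ resp_vecs[k] / np.sqrt(d) for k in departments])

# Stage 3 (sketch): combine the match evidence with the department
# description encoding; a random linear layer stands in for the
# trained classifier.
w = rng.normal(size=2 * d)
clf = np.array([w @ np.concatenate([match[i] * resp_vecs[k], desc_vecs[k]])
                for i, k in enumerate(departments)])

# Stage 4: re-rank candidate departments by the combined score and
# output the top department as the allocation result.
probs = softmax(match + clf)
ranked = sorted(zip(departments, probs), key=lambda t: -t[1])
print("dispatch to:", ranked[0][0])
```

In the actual system the matching and classification scores come from trained networks rather than random projections, but the control flow (encode, match, classify, re-rank) is the same.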


Key words

12345 hotline / event portrait / event allocation / answer selection / reinforced association

Cite this article

CHENG Xiaomin, CHEN Gang, CHEN Jianpeng, SHE Xiangrong, MAO Jian. RAVA: A Reinforced-Association-Based Method for 12345 Hotline Event Allocation. Journal of Chinese Information Processing. 2022, 36(10): 155-166, 172.


Funding

2021 Anhui Provincial Key Research and Development Program (202104a05020071); 2021 Anhui Provincial Science and Technology Innovation Strategy and Soft Science Research Program (202106f01050056)