Event Prediction Based on Event Evolutionary Graph and GCN

TANG Yan, CHEN Yi, ZHANG Zuowei

Journal of Chinese Information Processing, 2022, Vol. 36, Issue 12: 123-132.
Information Extraction and Text Mining

Abstract

Focusing on improving event evolutionary graph construction and enriching event representations, this paper proposes an event prediction model based on the event evolutionary graph and the Graph Convolutional Network (GCN). The model applies an event extraction model and redefines the edge weights of the event evolutionary graph by combining frequency and mutual information. The representation of the event context is learned by a BiLSTM and a memory network and is then fed into the GCN under the guidance of the event evolutionary graph. The final event prediction is jointly made by these event-relationship-aware, context-aware, and neighbor-aware event embeddings. Experimental results on the Gigaword benchmark show that the proposed model outperforms six state-of-the-art models in event prediction accuracy, achieving a 5.55% improvement over the most recent SGNN method.
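
As a rough illustration of the mechanisms summarized above, the sketch below shows one plausible way to combine co-occurrence frequency and pointwise mutual information into edge weights for the event evolutionary graph, followed by a standard GCN propagation step over that graph. The function names, the mixing coefficient alpha, and the positive-PMI truncation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def edge_weights(counts: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Combine row-normalized frequency and positive PMI into edge weights.
    counts[i, j] = how often event j follows event i in the extracted event
    chains (the statistic and the mixing scheme are assumptions)."""
    total = counts.sum()
    p_ij = counts / total                                   # joint probability of (i -> j)
    p_i = counts.sum(axis=1, keepdims=True) / total         # marginal of the source event
    p_j = counts.sum(axis=0, keepdims=True) / total         # marginal of the target event
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(p_ij / (p_i * p_j))
    pmi = np.nan_to_num(np.maximum(pmi, 0.0))               # keep positive PMI only
    freq = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
    return alpha * freq + (1 - alpha) * pmi

def gcn_layer(adj: np.ndarray, h: np.ndarray, w: np.ndarray) -> np.ndarray:
    """One graph-convolution step H' = ReLU(D^-1/2 (A + I) D^-1/2 H W), where
    H holds context-aware event embeddings and A is the weighted event graph."""
    a_hat = adj + np.eye(adj.shape[0])                      # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))  # symmetric normalization
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ h @ w, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    counts = rng.integers(0, 5, size=(4, 4)).astype(float)
    adj = edge_weights(counts)
    h = rng.normal(size=(4, 8))          # stand-in for BiLSTM + memory-network embeddings
    w = rng.normal(size=(8, 8))
    print(gcn_layer(adj, h, w).shape)    # -> (4, 8)
```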

Key words

event prediction / event evolutionary graph / graph convolutional network / memory network

Cite this article

TANG Yan, CHEN Yi, ZHANG Zuowei. Event Prediction Based on Event Evolutionary Graph and GCN. Journal of Chinese Information Processing, 2022, 36(12): 123-132.

Funding

National Key Research and Development Program of China (2017YFC0405805)