BSLRel: A Binary Sequence Labeling Based Cascading Relation Triple Extraction Model

ZHANG Longhui, YIN Shujuan, REN Feiliang, SU Jianlin, MING Ruicheng, BAI Yujia

Journal of Chinese Information Processing ›› 2021, Vol. 35 ›› Issue (6): 74-84.
Information Extraction and Text Mining


BSLRel: A Binary Sequence Labeling Based Cascading Relation Triple Extraction Model

  • ZHANG Longhui1, YIN Shujuan1, REN Feiliang1, SU Jianlin2, MING Ruicheng1, BAI Yujia1

Abstract

Relation triple extraction is fundamental to large-scale knowledge graph construction and has attracted wide attention from both academia and industry in recent years. To improve the extraction of overlapped relation triples and multi-slot relation triples, this paper proposes BSLRel, an end-to-end neural relation triple extraction model. BSLRel converts the relation triple extraction task into a cascade of binary sequence labeling tasks and integrates information with the multi-information fusion structure Conditional Layer Normalization. Experimental results show that BSLRel extracts overlapped and multi-slot relation triples effectively. With BSLRel, the team participated in the relation triple extraction task of the 2020 Language and Intelligence Challenge and ranked fifth.
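The paper itself gives no code on this page, but the Conditional Layer Normalization the abstract mentions is commonly formulated as a layer norm whose gain and bias are generated from a condition vector (e.g., an encoded subject entity that conditions object/relation tagging). The sketch below follows that common formulation; the projection matrices `w_gamma`/`w_beta` and the identity-initialized gain are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def conditional_layer_norm(x, cond, w_gamma, w_beta, eps=1e-6):
    """Layer-normalize x with gain/bias generated from a condition vector.

    x:       (d,) token representation to normalize
    cond:    (c,) condition vector (e.g., an encoded subject entity)
    w_gamma: (c, d) projection producing a per-dimension gain offset
    w_beta:  (c, d) projection producing a per-dimension bias
    """
    mu = x.mean()
    sigma = x.std()
    x_hat = (x - mu) / (sigma + eps)       # standard layer normalization
    gamma = 1.0 + cond @ w_gamma           # identity gain when cond projects to zero
    beta = cond @ w_beta                   # zero bias when cond projects to zero
    return gamma * x_hat + beta

# With a zero condition the layer reduces to plain layer normalization,
# which is why the gain is initialized around 1.
x = np.array([1.0, 2.0, 3.0, 4.0])
out = conditional_layer_norm(x, np.zeros(3), np.zeros((3, 4)), np.zeros((3, 4)))
```

In a cascading extractor, conditioning the normalization on the already-predicted subject lets the same encoder produce subject-specific representations for the subsequent binary tagging step, which is what makes overlapped triples (one subject, several objects/relations) recoverable.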


Key words

BSLRel / overlapped relation triple / multi-slot relation triple

Cite this article
ZHANG Longhui, YIN Shujuan, REN Feiliang, SU Jianlin, MING Ruicheng, BAI Yujia. BSLRel: A Binary Sequence Labeling Based Cascading Relation Triple Extraction Model. Journal of Chinese Information Processing. 2021, 35(6): 74-84

References

[1] Zhuang Chuanzhi, Jin Xiaolong, Zhu Weijian, et al. A survey on deep learning based relation extraction[J]. Journal of Chinese Information Processing, 2019, 33(12): 1-18. (in Chinese)
[2] Bai Long, Jin Xiaolong, Xi Pengbi, et al. A survey of distant supervision based relation extraction[J]. Journal of Chinese Information Processing, 2019, 33(10): 10-17. (in Chinese)
[3] Li S, He W, Shi Y, et al. DuIE: A large-scale Chinese dataset for information extraction[C]//Proceedings of CCF International Conference on Natural Language Processing and Chinese Computing, 2019: 791-800.
[4] Dong C H, Zhang J J, Zong C Q. Character-based LSTM-CRF with radical-level features for Chinese named entity recognition[C]//Proceedings of the 24th International Conference on Computer Processing of Oriental Languages, 2016: 239-250.
[5] Ling L, Zhihao Y, Pei Y, et al. An attention-based BiLSTM-CRF approach to document-level chemical named entity recognition[J]. Bioinformatics, 2018, 34(8): 1381-1388.
[6] Yan H, Deng B, Li X, et al. TENER: Adapting transformer encoder for named entity recognition[J].arXiv preprint arXiv: 1911.04474, 2019.
[7] Zhao S, Grishman R. Extracting relations with integrated information using kernel methods[C]//Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics, 2005: 419-426.
[8] Kambhatla N. Combining lexical, syntactic, and semantic features with maximum entropy models for extracting relations[C]//Proceedings of the ACL 2004 Interactive Poster and Demonstration Sessions, 2004: 22-25.
[9] Zelenko D, Aone C, Richardella A. Kernel methods for relation extraction[J]. Journal of Machine Learning Research, 2003, 3(3): 1083-1106.
[10] Zhou P, Shi W, Tian J, et al. Attention-based bidirectional long short-term memory networks for relation classification[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016: 120-129.
[11] Wu S, He Y. Enriching pre-trained language model with entity information for relation classification[C]//Proceedings of the 29th Conference on Information and Knowledge Management, 2019: 2361-2364.
[12] Yu X, Lam W. Jointly identifying entities and extracting relations in encyclopedia text via a graphical model approach[C]//Proceedings of the 23rd International Conference on Computational Linguistics: Posters, 2010: 1399-1407.
[13] Li Q, Ji H. Incremental joint extraction of entity mentions and relations[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, 2014: 402-412.
[14] Makoto Miwa, Yutaka Sasaki. Modeling joint entity and relation extraction with table representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, 2014: 1858-1869.
[15] Song Rui, Chen Xin, Hong Yu, et al. Relation extraction based on convolutional recurrent neural networks[J]. Journal of Chinese Information Processing, 2019, 33(10): 64-72. (in Chinese)
[16] Miwa M, Bansal M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016: 1105-1116.
[17] Zheng S, Wang F, Bao H, et al. Joint extraction of entities and relations based on a novel tagging scheme[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2017: 1227-1236.
[18] Yuan Y, Zhou X, Pan S, et al. A relation-specific attention network for joint entity and relation extraction[C]//Proceedings of the 29th International Joint Conferences on Artificial Intelligence, 2020: 4054-4060.
[19] Liu J, Chen S, Wang B, et al. Attention as relation: Learning supervised multi-head self-attention for relation extraction[C]//Proceedings of the 29th International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence, 2020: 3787-3793.
[20] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
[21] De Vries H, Strub F, Mary J, et al. Modulating early visual processing by language[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017: 6594-6604.

Funding

National Key R&D Program of China (2018YFC0830701); National Natural Science Foundation of China (61572120); Fundamental Research Funds for the Central Universities (N181602013)
