LI Hongyu, DUAN Liguo, HOU Chenlei, YAO Longfei. Joint Extraction of Entities and Relations Based on CWHC-AM[J]. Journal of Chinese Information Processing, 2022, 36(11): 79-90.
Joint Extraction of Entities and Relations Based on CWHC-AM
LI Hongyu1, DUAN Liguo1,2, HOU Chenlei1, YAO Longfei1
1. College of Information and Computer, Taiyuan University of Technology, Taiyuan, Shanxi 030024, China; 2. Transfer Preparatory Office, Shanxi Normal University College of Modern Arts and Sciences, Linfen, Shanxi 041000, China
Abstract: Entity and relation extraction aims to extract triples from unstructured natural language text. The traditional pipeline approach extracts entities first and then relations, which propagates errors between the two stages, ignores the internal connections and dependencies between the two subtasks, and performs poorly on n-ary and overlapping relations. To address these problems, this paper first decomposes n-ary relation extraction into multiple binary relation extraction problems and, fully exploiting the connection between the two subtasks, proposes a joint entity and relation extraction model based on CWHC-AM (character-word hybrid coding and attention mechanism). A multi-layer pointer-network labeling scheme casts the joint extraction task as a sequence labeling problem, enabling the extraction of overlapping relations. Finally, adversarial training is introduced to improve the robustness of the model. Experiments on the Baidu DuIE 2.0 Chinese dataset show that the proposed method can effectively extract n-ary and binary relations simultaneously and outperforms all baseline models.
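The multi-layer pointer-network labeling scheme mentioned in the abstract can be illustrated with a minimal sketch: each relation type gets its own "layer" of binary start/end tags over the token sequence, so triples of different relations that overlap on the same tokens do not collide in a single tag sequence. The relation set and the `tag_triples` helper below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a multi-layer pointer-network tagging scheme:
# one binary start-vector and one binary end-vector per relation type,
# so overlapping triples are encoded without label conflicts.

RELATIONS = ["founder", "capital"]  # assumed example relation set


def tag_triples(tokens, triples):
    """Convert (subject, relation, object) triples into per-relation
    start/end pointer labels over the token sequence."""
    n = len(tokens)
    labels = {r: {"start": [0] * n, "end": [0] * n} for r in RELATIONS}
    for subj, rel, obj in triples:
        # mark the first occurrence of the object span in this relation's layer
        for i in range(n - len(obj) + 1):
            if tokens[i:i + len(obj)] == obj:
                labels[rel]["start"][i] = 1
                labels[rel]["end"][i + len(obj) - 1] = 1
                break
    return labels


tokens = ["Beijing", "is", "the", "capital", "of", "China"]
triples = [(["China"], "capital", ["Beijing"])]
labels = tag_triples(tokens, triples)
# labels["capital"]["start"][0] == 1 and labels["capital"]["end"][0] == 1
```

Decoding reverses this: each relation layer's predicted start/end positions are paired back into spans, which is what turns the joint task into plain sequence labeling.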
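The adversarial training the abstract refers to typically follows the fast gradient method of Goodfellow et al.: a small perturbation proportional to the normalized gradient of the loss is added to the input embeddings, and the model is trained on the perturbed input as well. The sketch below shows only the perturbation step, with `epsilon` as an assumed hyperparameter; it is not the paper's exact procedure.

```python
import math


def fgm_perturbation(grad, epsilon=1.0):
    """Fast-gradient-method perturbation: r_adv = epsilon * g / ||g||_2.
    In adversarial training, r_adv is added to the embedding before a
    second forward/backward pass; epsilon is an assumed hyperparameter."""
    norm = math.sqrt(sum(g * g for g in grad))
    if norm == 0.0:
        return [0.0] * len(grad)  # zero gradient: no perturbation
    return [epsilon * g / norm for g in grad]
```

Normalizing by the L2 norm keeps the perturbation magnitude fixed at `epsilon` regardless of the gradient's scale, which is what makes the attack a controlled robustness regularizer rather than a destabilizing noise source.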