%0 Journal Article
%A XIONG Kai
%A DU Li
%A DING Xiao
%A LIU Ting
%A QIN Bing
%A FU Bo
%T Knowledge Enhanced Pre-trained Language Model for Textual Inference
%D 2023
%R
%J Journal of Chinese Information Processing
%P 27-35
%V 36
%N 12
%X Although pre-trained language models have achieved high performance on a large number of natural language processing tasks, the knowledge contained in some pre-trained language models is insufficient to support more efficient textual inference. Focusing on enhancing pre-trained language models with rich knowledge for textual inference, we propose a framework that integrates knowledge graphs and graph structures into the pre-trained language model. Experiments on two subtasks of textual inference show that our framework outperforms a series of baseline methods.
%U http://jcip.cipsc.org.cn/EN/abstract/article_3438.shtml