Knowledge Representation and Acquisition
XU Yao, HE Shizhu, LIU Kang, ZHANG Chi, JIAO Fei, ZHAO Jun
2022, 36(10): 54-62.
In recent years, embedding models for deterministic knowledge graphs have made great progress on tasks such as knowledge graph completion. However, designing and training embedding models for uncertain knowledge graphs remains an important challenge. Unlike in a deterministic knowledge graph, each fact triple in an uncertain knowledge graph carries a confidence value, so an uncertain knowledge graph embedding model must accurately estimate the confidence of each triple. Existing uncertain knowledge graph embedding models have relatively simple structures: they can only handle symmetric relations and do not deal well with the false-negative problem. To address these problems, we first propose a unified framework for training uncertain knowledge graph embedding models, which trains them with a multi-model semi-supervised learning method. To counter the excessive noise in the semi-supervised samples, we use Monte Carlo Dropout to estimate the model's uncertainty about its predictions and, based on this uncertainty, effectively filter the noisy semi-supervised samples. In addition, to better represent the uncertainty of entities and relations in an uncertain knowledge graph and thereby handle more complex relations, we propose UBetaE, an uncertain knowledge graph embedding model based on the Beta distribution, which represents both entities and relations as sets of mutually independent Beta distributions. Experimental results on public datasets show that combining the proposed semi-supervised learning method with the UBetaE model not only greatly alleviates the false-negative problem but also significantly outperforms current SOTA uncertain knowledge graph embedding models such as UKGE on multiple tasks.
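The abstract's noise-filtering step, Monte Carlo Dropout, is a standard technique: keep dropout active at inference, run several stochastic forward passes, and treat the spread of the predictions as the model's uncertainty. The sketch below illustrates the idea with a toy confidence scorer; the model architecture, sample count, and filtering threshold are illustrative assumptions, not the paper's actual settings.

```python
import torch
import torch.nn as nn

class ConfidenceScorer(nn.Module):
    """Toy triple-confidence scorer (hypothetical stand-in for an
    uncertain-KG embedding model) with a dropout layer."""
    def __init__(self, dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 * dim, dim),  # concatenated (h, r, t) embeddings
            nn.ReLU(),
            nn.Dropout(p=0.3),
            nn.Linear(dim, 1),
            nn.Sigmoid(),             # confidence in (0, 1)
        )

    def forward(self, x):
        return self.net(x)

def mc_dropout_uncertainty(model, x, n_samples=50):
    """Run n stochastic forward passes with dropout enabled; return the
    mean confidence and its standard deviation (the uncertainty)."""
    model.train()  # keep dropout active at inference time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)

# Filter pseudo-labelled triples whose predictive uncertainty is too high.
model = ConfidenceScorer()
x = torch.randn(8, 48)  # 8 candidate triples, 3 * dim = 48 features each
mean_conf, uncertainty = mc_dropout_uncertainty(model, x)
keep = uncertainty.squeeze(-1) < 0.05  # threshold chosen for illustration
```

Only the low-uncertainty triples in `keep` would be retained as semi-supervised training samples; the rest are discarded as likely noise.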
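The abstract does not give UBetaE's actual scoring function, but the core representation it describes, entities and relations as sets of mutually independent Beta distributions, can be sketched directly with `torch.distributions`. The confidence score below (a KL-divergence-based similarity squashed into (0, 1)) is an illustrative assumption, not the paper's formula.

```python
import torch
from torch.distributions import Beta, kl_divergence

torch.manual_seed(0)
dim = 8  # number of independent Beta components per entity/relation

def random_beta_params():
    """Positive (alpha, beta) parameters for `dim` independent Beta
    distributions; softplus keeps them strictly positive."""
    return torch.nn.functional.softplus(torch.randn(dim, 2)) + 0.05

# Head entity, relation, and tail entity, each a set of independent Betas.
head, rel, tail = (Beta(p[:, 0], p[:, 1])
                   for p in (random_beta_params() for _ in range(3)))

# Illustrative triple confidence: component-wise KL divergence between the
# head and tail distributions, weighted by the relation's mean, then
# squashed into (0, 1) so it can be compared with an observed confidence.
kl = kl_divergence(head, tail)               # shape: (dim,)
confidence = torch.sigmoid(-(kl * rel.mean).mean())
```

Because each embedding is a distribution rather than a point, the representation carries uncertainty natively, which is what allows such a model to fit a confidence per triple rather than a binary label.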