Knowledge Representation and Acquisition
DU Wenqian, LI Bicheng, WANG Rui
2020, 34(7): 50-59.
Representation learning for knowledge graphs aims to project entities and relations into a continuous low-dimensional vector space. Most existing translation-based methods, such as TransE, TransH, and TransR, utilize only the triples of the knowledge graph and fail to handle complex relations such as one-to-many, many-to-one, and many-to-many. To address this issue, this paper proposes a knowledge graph representation learning model that integrates entity descriptions and entity types, called the TDT model. First, the Doc2Vec model is used to obtain embeddings of all entity descriptions. Second, treating hierarchical types as projection matrices for entities, the embedding encoding entity type information is obtained by multiplying the projection matrix with the entity embeddings of the triple. Finally, the TDT model integrates triples (T), entity descriptions (D), and entity types (T) in a single low-dimensional vector space. The TDT model is evaluated through link prediction and triple classification experiments on real-world datasets. The results show that the new method significantly outperforms baselines such as TransE, TransR, DKRL, and SimplE.
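The translation assumption behind TransE-style models, and the hierarchical-type projection described above, can be sketched minimally as follows. This is an illustrative sketch only, not the authors' implementation; the function names, toy vectors, and identity projection matrices are assumptions for demonstration.

```python
import numpy as np

def transe_score(h, r, t):
    # Translation assumption: h + r ≈ t for a valid triple (h, r, t),
    # so a smaller L2 distance means a more plausible triple.
    return np.linalg.norm(h + r - t)

def projected_score(M_h, h, r, M_t, t):
    # Hierarchical-type projection (as in the TDT model's type component):
    # each entity embedding is first mapped by a type-specific projection
    # matrix, then the translation check is applied in the projected space.
    return transe_score(M_h @ h, r, M_t @ t)

# Hypothetical 3-dimensional toy embeddings for illustration only.
h = np.array([0.1, 0.2, 0.3])
r = np.array([0.4, 0.1, -0.2])
t = np.array([0.5, 0.3, 0.1])

print(transe_score(h, r, t))  # 0.0: h + r equals t exactly, maximally plausible
```

With identity projection matrices, `projected_score` reduces to the plain TransE score; learned type matrices would instead constrain entities to type-consistent subspaces, which is how type information helps distinguish one-to-many and many-to-many relations.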