AAAI Publications, Twenty-Ninth AAAI Conference on Artificial Intelligence

Learning Entity and Relation Embeddings for Knowledge Graph Completion
Yankai Lin, Zhiyuan Liu, Maosong Sun, Yang Liu, Xuan Zhu

Last modified: 2015-02-19

Abstract


Knowledge graph completion aims to perform link prediction between entities. In this paper, we consider the approach of knowledge graph embeddings. Recently, models such as TransE and TransH build entity and relation embeddings by regarding a relation as a translation from head entity to tail entity. We note that these models simply put both entities and relations within the same semantic space. In fact, an entity may have multiple aspects, and various relations may focus on different aspects of entities, which makes a common space insufficient for modeling. In this paper, we propose TransR, which builds entity and relation embeddings in a separate entity space and relation-specific spaces. We then learn embeddings by first projecting entities from the entity space into the corresponding relation space, and then building translations between the projected entities. In experiments, we evaluate our models on three tasks: link prediction, triple classification, and relational fact extraction. Experimental results show significant and consistent improvements over state-of-the-art baselines, including TransE and TransH.
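The projection-then-translation idea in the abstract can be sketched numerically. The snippet below is a minimal, illustrative implementation of the dissimilarity score described above (project head and tail entities into the relation's space with a relation-specific matrix, then measure how well the relation vector translates one projection to the other); the function name and the squared-L2 choice are assumptions for illustration, not the paper's exact training objective.

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """Dissimilarity score for a triple (h, r, t) in the TransR style.

    h, t : entity embeddings in entity space (dimension d_e)
    r    : relation embedding in relation space (dimension d_r)
    M_r  : relation-specific projection matrix, shape (d_r, d_e)

    Entities are first projected into the relation's own space; the
    relation is then modeled as a translation between the projections.
    Lower scores indicate more plausible triples.
    """
    h_r = M_r @ h  # project head entity into relation space
    t_r = M_r @ t  # project tail entity into relation space
    return float(np.sum((h_r + r - t_r) ** 2))  # squared L2 distance
```

For a triple that the translation fits exactly (projected head plus relation equals projected tail), the score is zero; corrupted triples score higher, which is the signal used to rank candidate links.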

Keywords


knowledge graph embedding; knowledge graph completion; relation extraction; knowledge representation
