Learning Entity and Relation Embeddings for Knowledge Graph Completion

2015 | Yankai Lin, Zhiyuan Liu*, Maosong Sun, Yang Liu, Xuan Zhu
This paper addresses the challenge of knowledge graph completion, which involves predicting missing relations between entities. The authors propose TransR, a novel approach that models entities and relations in separate entity and relation spaces, improving upon existing models like TransE and TransH. TransR projects entities from the entity space to the corresponding relation space and then performs translations between the projected entities. To handle diverse patterns within specific relations, the authors introduce Cluster-based TransR (CTransR), which clusters entity pairs under the same relation and learns a distinct relation vector for each cluster.

Experimental results on benchmark datasets (WordNet and Freebase) show that TransR and CTransR significantly outperform state-of-the-art models in link prediction, triple classification, and relational fact extraction tasks. The paper also discusses the training method, implementation details, and future directions, including exploring more sophisticated models for internal correlations within relations and combining text-based relation extraction models with knowledge graph embeddings.
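The core ideas above can be sketched in a few lines: project head and tail entities into the relation-specific space with a matrix, translate by the relation vector, and score by distance. The dimensions, random initialization, and margin value below are illustrative assumptions, not the paper's settings; a real implementation would learn these parameters by minimizing a margin-based ranking loss over corrupted triples.

```python
import numpy as np

rng = np.random.default_rng(0)
k, d = 4, 3  # entity-space and relation-space dimensions (assumed for illustration)

h = rng.normal(size=k)          # head entity embedding (entity space)
t = rng.normal(size=k)          # tail entity embedding (entity space)
r = rng.normal(size=d)          # relation embedding (relation space)
M_r = rng.normal(size=(d, k))   # relation-specific projection matrix

def transr_score(h, t, r, M_r):
    """Return ||M_r h + r - M_r t||_2^2, the TransR-style dissimilarity.

    Lower scores indicate a more plausible triple (h, r, t)."""
    h_r = M_r @ h  # project head into the relation space
    t_r = M_r @ t  # project tail into the relation space
    return float(np.sum((h_r + r - t_r) ** 2))

def margin_loss(pos_score, neg_score, gamma=1.0):
    """Margin-based ranking loss over a valid triple and a corrupted one,
    in the style used to train translation-based models (gamma assumed)."""
    return max(0.0, gamma + pos_score - neg_score)
```

During training, a corrupted triple is formed by replacing the head or tail with a random entity; the loss pushes valid triples to score at least `gamma` lower than corrupted ones.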