Learning Entity and Relation Embeddings for Knowledge Graph Completion

2015 | Yankai Lin, Zhiyuan Liu, Maosong Sun, Yang Liu, Xuan Zhu
This paper proposes TransR, a knowledge graph embedding model that represents entities and relations in distinct semantic spaces. Unlike earlier translation-based methods such as TransE and TransH, which embed entities and relations in a single common space, TransR models entities in an entity space and each relation in its own relation-specific space. To score a triple (h, r, t), TransR projects the head and tail entities into the space of relation r and treats the relation as a translation between the projected entities; the intuition is that entities close to one another in entity space may still need to be far apart under a particular relation. A minimal sketch of the scoring function is given below.

The paper additionally introduces CTransR, which extends TransR by clustering the entity pairs observed under each relation and learning a distinct relation vector for each cluster. This lets the model capture fine-grained correlations within a single relation type, whose entity pairs often exhibit several distinct sub-patterns; a sketch follows the TransR example.
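Concretely, the paper scores a triple by projecting both entities with a relation-specific matrix M_r and measuring the translation error h·M_r + r − t·M_r; training uses the margin-based ranking loss standard in this model family. The NumPy sketch below illustrates this; the function names, dimensions, and hyperparameter values are illustrative choices of this summary, not from the paper.

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """TransR score for a triple (h, r, t): lower is more plausible.

    h, t : entity embeddings in R^k (shared entity space)
    r    : relation embedding in R^d (relation-specific space)
    M_r  : k x d projection matrix for relation r
    """
    h_r = h @ M_r  # project head entity into the space of relation r
    t_r = t @ M_r  # project tail entity into the space of relation r
    return np.sum((h_r + r - t_r) ** 2)  # squared L2 translation error

def margin_loss(pos, neg, gamma=1.0):
    """Margin-based ranking loss: a correct triple should score at
    least `gamma` lower than its corrupted counterpart."""
    return max(0.0, gamma + pos - neg)

# Illustrative usage with random vectors (k=50 entity dims, d=30 relation dims).
rng = np.random.default_rng(0)
k, d = 50, 30
h, t = rng.normal(size=k), rng.normal(size=k)
r, M_r = rng.normal(size=d), rng.normal(size=(k, d))
print(transr_score(h, t, r, M_r))
```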
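For CTransR, the paper groups the entity pairs of each relation by clustering their offset vectors h − t (computed from pre-trained embeddings) and learns a cluster-specific vector r_c, with a regularizer keeping r_c close to the relation-wide vector r. A minimal sketch, assuming scikit-learn's KMeans; the cluster count and alpha are illustrative, not the paper's settings:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_relation_pairs(pairs, entity_emb, n_clusters=4):
    """CTransR preprocessing sketch: cluster the entity pairs observed
    with one relation by the offsets (h - t) of their pre-trained
    embeddings, so each cluster can get its own relation vector r_c.

    pairs      : list of (head_id, tail_id) seen with this relation
    entity_emb : |E| x k matrix of pre-trained entity embeddings
    """
    offsets = np.array([entity_emb[h] - entity_emb[t] for h, t in pairs])
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(offsets)

def ctransr_score(h, t, r_c, r, M_r, alpha=0.1):
    """CTransR score: translate with the cluster-specific vector r_c,
    plus a penalty ||r_c - r||^2 that ties cluster vectors to the
    relation-wide vector (alpha is a hyperparameter)."""
    h_r, t_r = h @ M_r, t @ M_r
    return np.sum((h_r + r_c - t_r) ** 2) + alpha * np.sum((r_c - r) ** 2)
```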
The models are evaluated on three tasks: link prediction, triple classification, and relational fact extraction from text. Experimental results show that TransR and CTransR significantly outperform TransE and TransH: TransR achieves consistent and significant improvements in link prediction and triple classification, and CTransR improves further by capturing fine-grained correlations within each relation type. The "bern" negative-sampling strategy also enhances the performance of TransE, TransH, and TransR on all three tasks (sketched at the end of this summary). For relation extraction from text, TransR is combined with a text-based extraction model to rank candidate facts; there it outperforms TransE and is comparable with TransH in certain recall ranges.

As future work, the authors suggest exploring more sophisticated models for knowledge graph embeddings and integrating text-based and knowledge-graph-based approaches into a unified embedding model.
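The "bern" strategy mentioned above, introduced with TransH, corrupts the head of a training triple with probability tph / (tph + hpt), where tph is the average number of tails per head and hpt the average number of heads per tail for that relation; this reduces false negatives for one-to-many and many-to-one relations. A minimal sketch, with data structures and function names of my own choosing:

```python
import random
from collections import defaultdict

def bern_head_probs(triples):
    """Per-relation probability of corrupting the HEAD entity:
    p = tph / (tph + hpt). Replacing the head of a one-to-many
    relation (high tph) is less likely to create a false negative."""
    tails_of = defaultdict(set)  # (r, h) -> tails seen with that head
    heads_of = defaultdict(set)  # (r, t) -> heads seen with that tail
    for h, r, t in triples:
        tails_of[(r, h)].add(t)
        heads_of[(r, t)].add(h)

    tph_sum, tph_cnt = defaultdict(float), defaultdict(int)
    for (r, _), ts in tails_of.items():
        tph_sum[r] += len(ts); tph_cnt[r] += 1
    hpt_sum, hpt_cnt = defaultdict(float), defaultdict(int)
    for (r, _), hs in heads_of.items():
        hpt_sum[r] += len(hs); hpt_cnt[r] += 1

    probs = {}
    for r in tph_cnt:
        tph = tph_sum[r] / tph_cnt[r]
        hpt = hpt_sum[r] / hpt_cnt[r]
        probs[r] = tph / (tph + hpt)
    return probs

def corrupt(triple, entities, head_prob):
    """Draw a negative triple by replacing head or tail according to
    the bern probability for the triple's relation. A full
    implementation would re-sample if the result is an observed triple."""
    h, r, t = triple
    if random.random() < head_prob[r]:
        return (random.choice(entities), r, t)  # corrupt head
    return (h, r, random.choice(entities))      # corrupt tail
```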