Knowledge Graph Embedding via Dynamic Mapping Matrix


July 26-31, 2015 | Guoliang Ji, Shizhu He, Liheng Xu, Kang Liu and Jun Zhao
This paper proposes TransD, a knowledge graph embedding model that improves upon earlier translation-based methods such as TransE, TransH, and TransR/CTransR. TransD represents each entity and relation with two vectors: one captures its meaning, and the other is used to dynamically construct mapping matrices. This allows more fine-grained modeling of entities and relations, enabling better handling of diverse relationships.

Unlike TransR/CTransR, which use a fixed mapping matrix per relation, TransD constructs its mapping matrices dynamically from both the entity and the relation, yielding fewer parameters and requiring no matrix-vector multiplication. This makes TransD more efficient and better suited to large-scale knowledge graphs. The model is evaluated on two tasks: triplet classification and link prediction. Experimental results show that TransD outperforms previous methods on both tasks, particularly link prediction, with its dynamic mapping matrices giving a more accurate representation of entities and relations for knowledge graph completion. The paper also analyzes the properties of the projection vectors used in TransD and highlights the importance of accounting for the diversity of entities and relations in knowledge graphs. Overall, TransD provides a more flexible and efficient approach to knowledge graph embedding.
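To make the "no matrix-vector multiplication" point concrete, here is a minimal pure-Python sketch of TransD's projection and scoring. In the paper, the mapping matrix for a head entity is M = r_p h_p^T + I, so the projection M h reduces to r_p * (h_p . h) plus h (zero-padded or truncated to the relation dimension), which needs only vector operations. The function names and plain-list representation are illustrative choices, not the authors' code:

```python
def transd_project(e, e_p, r_p):
    """Project entity vector e into the relation space using the dynamic
    mapping matrix M = r_p * e_p^T + I, without building M explicitly:
    M e = r_p * (e_p . e) + pad(e)."""
    m, n = len(r_p), len(e)          # relation dim m, entity dim n
    dot = sum(a * b for a, b in zip(e_p, e))
    # Identity part: copy e into the first min(m, n) components, zero elsewhere.
    return [r_p[i] * dot + (e[i] if i < n else 0.0) for i in range(m)]

def transd_score(h, h_p, r, r_p, t, t_p):
    """Squared L2 translation score ||h_perp + r - t_perp||^2.
    Lower scores indicate more plausible triples (h, r, t)."""
    h_perp = transd_project(h, h_p, r_p)
    t_perp = transd_project(t, t_p, r_p)
    return sum((hp + ri - tp) ** 2 for hp, ri, tp in zip(h_perp, r, t_perp))
```

For example, when head and tail share the same embedding and projection vector and the relation vector is zero, the score is exactly 0, since both entities project to the same point.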