Holographic Embeddings of Knowledge Graphs

7 Dec 2015 | Maximilian Nickel, Lorenzo Rosasco, Tomaso Poggio
This paper introduces HolE (holographic embeddings), a compositional vector space model for knowledge graphs (KGs) that uses circular correlation to build efficient and scalable representations of relational data. HolE is designed to capture rich interactions between entities and relations while remaining computationally efficient. The model draws on holographic models of associative memory, in which circular correlation serves as the compositional operator that generates a composite representation of an entity pair. Unlike tensor product models, whose composite representation of two d-dimensional embeddings is d²-dimensional, circular correlation yields a fixed-width composite with the same dimensionality as the individual entity embeddings, making HolE far more efficient in both memory and computation (see the sketches below).

The model is trained with stochastic gradient descent to minimize a logistic loss, learning embeddings that predict relational patterns in KGs. Experimental results show that HolE outperforms state-of-the-art methods on benchmark datasets for link prediction and relational learning, and that it captures complex relational patterns while remaining highly parameter-efficient. HolE's connection to holographic models of associative memory provides a theoretical foundation for its effectiveness in relational learning and opens up new possibilities for querying and reasoning with knowledge graphs. The paper also examines the scalability of HolE, demonstrating its ability to handle large datasets and complex relational structures.
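To make the compositional operator concrete, here is a minimal NumPy sketch (variable and function names are my own, not from the authors' code). It computes circular correlation via the FFT identity a ⋆ b = F⁻¹(conj(F(a)) ⊙ F(b)), which runs in O(d log d), and scores a triple (s, p, o) as σ(r_pᵀ(e_s ⋆ e_o)), the probability HolE assigns to the triple being true:

```python
import numpy as np

def ccorr(a, b):
    """Circular correlation [a * b]_k = sum_i a_i * b_((i+k) mod d),
    computed in O(d log d) via the FFT identity
    a * b = ifft(conj(fft(a)) * fft(b))."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def hole_prob(e_s, r_p, e_o):
    """HolE triple score: sigma(r_p . (e_s ccorr e_o))."""
    eta = r_p @ ccorr(e_s, e_o)
    return 1.0 / (1.0 + np.exp(-eta))

# Toy usage with random embeddings; d = 128 is an arbitrary choice.
d = 128
rng = np.random.default_rng(0)
e_s, r_p, e_o = rng.normal(size=(3, d)) * 0.1
print(hole_prob(e_s, r_p, e_o))
```

Note that the composite ccorr(e_s, e_o) has the same dimensionality d as the inputs, which is exactly the fixed-width property described above.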
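The training step described above is similarly compact. The sketch below performs one SGD update on the logistic loss log(1 + exp(−y·η)) for a single labeled triple, with y = +1 for an observed triple and y = −1 for a sampled negative; a useful property here is that the gradients of η with respect to the embeddings can themselves be written as circular correlations and convolutions. The learning rate and the assumption of a per-triple update are illustrative choices, not the paper's exact training configuration:

```python
import numpy as np

def ccorr(a, b):
    # circular correlation via FFT
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def cconv(a, b):
    # circular convolution via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def sgd_step(e_s, r_p, e_o, y, lr=0.1):
    """One SGD step on the logistic loss log(1 + exp(-y * eta)),
    where eta = r_p . (e_s ccorr e_o) and y is +1 or -1.
    Gradients of eta:  d/dr_p = e_s ccorr e_o
                       d/de_s = r_p ccorr e_o
                       d/de_o = r_p cconv e_s"""
    eta = r_p @ ccorr(e_s, e_o)
    g = -y / (1.0 + np.exp(y * eta))          # dL/d(eta)
    e_s_new = e_s - lr * g * ccorr(r_p, e_o)
    r_p_new = r_p - lr * g * ccorr(e_s, e_o)
    e_o_new = e_o - lr * g * cconv(r_p, e_s)
    return e_s_new, r_p_new, e_o_new
```

How negative triples are sampled is not specified in this summary; corrupting the subject or object of observed triples is a common choice for this family of models.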
Overall, HolE offers a promising approach to learning from relational data in knowledge graphs, combining the expressive power of compositional models with the efficiency of simple models such as TransE.