7 Dec 2015 | Maximilian Nickel, Lorenzo Rosasco, Tomaso Poggio
The paper introduces Holographic Embeddings (HolE), a novel method for learning compositional vector space representations of knowledge graphs. HolE uses circular correlation to compose entity representations, an operation that captures rich interactions between embeddings while remaining efficient to compute, easy to train, and scalable to large datasets. The method is related to holographic models of associative memory, where circular correlation is used to store and retrieve information. Experimental results show that HolE outperforms state-of-the-art methods on various benchmark datasets for link prediction and relational learning tasks. The paper also discusses the connection to holographic associative memory in detail, showing how HolE can be interpreted in that framework.
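The core operation can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: circular correlation of two d-dimensional vectors is computed in O(d log d) via the FFT identity F(a ⋆ b) = conj(F(a)) ⊙ F(b), and a triple (s, p, o) is scored by projecting the composed vector onto the relation embedding. The function names `circular_correlation` and `hole_score` are hypothetical.

```python
import numpy as np

def circular_correlation(a, b):
    # Circular correlation: [a * b]_k = sum_i a_i * b_{(k+i) mod d},
    # computed via FFT instead of the naive O(d^2) double loop.
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def hole_score(e_s, r_p, e_o):
    # HolE scores a triple by the inner product of the relation
    # embedding with the circular correlation of subject and object.
    # Returned here as a raw logit; the paper applies a sigmoid for
    # the probability of the triple holding.
    return float(r_p @ circular_correlation(e_s, e_o))

# Toy usage with random embeddings of dimension d = 8:
rng = np.random.default_rng(0)
e_s, r_p, e_o = (rng.standard_normal(8) for _ in range(3))
print(hole_score(e_s, r_p, e_o))
```

Because circular correlation (unlike circular convolution) is not commutative, the score distinguishes subject from object, which matters for asymmetric relations.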