2011 | Maximilian Nickel, Volker Tresp, Hans-Peter Kriegel
This paper presents a novel approach to relational learning based on the factorization of a three-way tensor, called RESCAL. Unlike other tensor approaches, RESCAL enables collective learning through the latent components of the model and provides an efficient algorithm for computing the factorization. The method is evaluated on both a new dataset and a commonly used dataset for entity resolution, showing better or comparable results to current state-of-the-art relational learning solutions while being significantly faster to compute.
Relational learning deals with domains where entities are interconnected by multiple relations, so correlations can occur between entities, between relations, and across their interconnections. RESCAL models this with a tensor factorization that captures the interactions between entities and relations: the data are arranged as a three-way tensor with one frontal slice per relation, and each slice is factorized under a rank-r model as $ \mathcal{X}_{k} \approx A R_{k} A^{T} $. Here A holds the latent-component representation of the entities and is shared across all relations, which is what enables collective learning, while $ R_{k} $ models the interactions of the latent components in the k-th predicate.
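The slice-wise factorization above can be illustrated with a small NumPy sketch. The sizes and random factors below are purely illustrative; the point is that a single entity matrix A is shared across all relation slices, while each relation gets its own interaction matrix R_k:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 5, 3, 2   # toy sizes: entities, latent rank, relations

# Latent-component representation of entities (one row per entity),
# shared across every relation slice
A = rng.standard_normal((n, r))
# One r x r matrix per relation, modeling latent-component interactions
R = [rng.standard_normal((r, r)) for _ in range(m)]

# Reconstructed slice for relation k: X_k ~ A R_k A^T
X_hat = [A @ R_k @ A.T for R_k in R]

# The score of a triple (entity i, relation k, entity j) is one
# entry of the reconstructed slice
i, k, j = 0, 1, 2
score = A[i] @ R[k] @ A[j]
```

Because A appears on both sides of each slice, information about an entity propagates through every relation it participates in, which is the mechanism behind collective learning in this model.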
The model is compared to other tensor factorizations such as CP and DEDICOM and shown to perform better in both accuracy and computational efficiency. RESCAL is also evaluated on benchmark datasets for collective classification and entity resolution, demonstrating its effectiveness on these tasks. The factorization is computed with an alternating least-squares procedure whose subproblems have closed-form updates, which makes the algorithm efficient and scalable.
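The alternating least-squares scheme can be sketched as follows. This is a minimal, unregularized illustration in the spirit of the paper's ALS procedure, not the authors' implementation: the function name and loop structure are my own, the R_k update solves the least-squares problem $\mathcal{X}_k \approx A R_k A^T$ for fixed A via the Kronecker identity $\operatorname{vec}(A R_k A^T) = (A \otimes A)\operatorname{vec}(R_k)$, and the A update uses the closed-form expression that accounts for A appearing in both factor positions:

```python
import numpy as np

def rescal_als(X, rank, iters=20, seed=0):
    """Simplified ALS sketch for X_k ~ A R_k A^T (no regularization)."""
    n = X[0].shape[0]
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, rank))
    R = [np.zeros((rank, rank)) for _ in X]
    for _ in range(iters):
        # R_k update for fixed A: vec(R_k) = pinv(A kron A) vec(X_k)
        Zpinv = np.linalg.pinv(np.kron(A, A))
        R = [(Zpinv @ Xk.reshape(-1)).reshape(rank, rank) for Xk in X]
        # A update, treating both occurrences of A jointly
        AtA = A.T @ A
        num = sum(Xk @ A @ Rk.T + Xk.T @ A @ Rk for Xk, Rk in zip(X, R))
        den = sum(Rk @ AtA @ Rk.T + Rk.T @ AtA @ Rk for Rk in R)
        A = num @ np.linalg.pinv(den)
    return A, R
```

Note that the pinv-based R_k update shown here costs O(n^2 r^2) memory for the Kronecker product and is only workable at toy scale; the paper's algorithm exploits structure to stay efficient on large tensors.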
The paper also discusses the computational advantages of RESCAL, including its ability to handle large datasets and its efficiency in terms of runtime performance. RESCAL is shown to outperform other methods in terms of both accuracy and speed, particularly in tasks involving collective learning. The results indicate that RESCAL is a promising approach for relational learning, especially in domains where collective learning is important. The paper concludes with a discussion of future work, including the investigation of distributed versions of RESCAL and a stochastic gradient descent approach to the optimization problem.