Tensor Decompositions for Learning Latent Variable Models


2014 | Animashree Anandkumar, Rong Ge, Daniel Hsu, Sham M. Kakade, Matus Telgarsky
This paper presents a computationally and statistically efficient method for parameter estimation in a wide class of latent variable models, including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation. The method exploits the tensor structure of low-order observable moments, typically second- and third-order moments, to reduce parameter estimation to extracting a symmetric orthogonal decomposition of a tensor. This decomposition generalizes the singular value decomposition of matrices and can be computed efficiently by several approaches, such as power iterations and maximization methods.
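To make the analogy with the matrix SVD concrete, here is a minimal NumPy sketch (all names, sizes, and values are illustrative, not taken from the paper) of a symmetric third-order tensor with an orthogonal decomposition, together with the multilinear map T(I, v, v) that plays the role of a matrix-vector product:

```python
import numpy as np

# Illustrative example: a symmetric third-order tensor with an orthogonal
# decomposition T = sum_i lambda_i * u_i (x) u_i (x) u_i -- the structure
# that suitably whitened third-order moments take in these models.
rng = np.random.default_rng(0)
k = 3
U, _ = np.linalg.qr(rng.standard_normal((k, k)))  # orthonormal columns u_i
lams = np.array([3.0, 2.0, 1.0])                  # "eigenvalues" lambda_i

T = np.zeros((k, k, k))
for i in range(k):
    u = U[:, i]
    T += lams[i] * np.einsum("a,b,c->abc", u, u, u)

# The multilinear map T(I, v, v) = sum_i lambda_i (v . u_i)^2 u_i is the
# tensor analogue of a matrix-vector product.
def tvv(T, v):
    return np.einsum("abc,b,c->a", T, v, v)

# For v = u_0 the map returns lambda_0 * u_0, so each u_i is a "tensor
# eigenvector" of T, in analogy with the singular vectors of a matrix.
print(np.allclose(tvv(T, U[:, 0]), lams[0] * U[:, 0]))  # True
```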
The paper provides a detailed analysis of a robust tensor power method, establishing an analogue of Wedin's perturbation theorem for the singular vectors of matrices. This analysis guarantees robust and computationally tractable estimation for several popular latent variable models. The paper also addresses practical issues in applying tensor decomposition methods to latent variable models, showing that the basic operations can be executed in time linear in the dimension of the observations and the size of the training data.
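The core iteration of the tensor power method maps v to T(I, v, v) and renormalizes, with random restarts and deflation used to extract one component at a time. The following is a hedged sketch of that scheme (the function name, restart count, and iteration budget are illustrative choices, not the paper's exact parameters):

```python
import numpy as np

def tensor_power_method(T, n_restarts=10, n_iters=100, seed=0):
    """Sketch: recover (lambda_i, u_i) from T = sum_i lambda_i u_i^{(x)3},
    assuming the u_i are orthonormal (an illustrative implementation)."""
    rng = np.random.default_rng(seed)
    k = T.shape[0]
    T = T.copy()
    pairs = []
    for _ in range(k):
        best_lam, best_v = -np.inf, None
        for _ in range(n_restarts):        # random restarts for robustness
            v = rng.standard_normal(k)
            v /= np.linalg.norm(v)
            for _ in range(n_iters):       # v <- T(I, v, v) / ||T(I, v, v)||
                w = np.einsum("abc,b,c->a", T, v, v)
                v = w / np.linalg.norm(w)
            lam = np.einsum("abc,a,b,c->", T, v, v, v)
            if lam > best_lam:
                best_lam, best_v = lam, v
        pairs.append((best_lam, best_v))
        # Deflate: subtract the recovered rank-one component and repeat.
        T -= best_lam * np.einsum("a,b,c->abc", best_v, best_v, best_v)
    return pairs

# Usage on a synthetic orthogonal tensor with components lambda = 3, 2, 1:
U, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))
T = sum(l * np.einsum("a,b,c->abc", U[:, i], U[:, i], U[:, i])
        for i, l in enumerate([3.0, 2.0, 1.0]))
pairs = tensor_power_method(T)
print(sorted(round(l, 2) for l, _ in pairs))  # [1.0, 2.0, 3.0]
```

Each iteration costs time linear in the number of tensor entries, and in practice the map T(I, v, v) is formed directly from the data rather than from an explicitly stored tensor, which is what yields the linear-time operations mentioned above.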