A Unifying Review of Linear Gaussian Models

1999 | Sam Roweis, Zoubin Ghahramani
This paper presents a unifying review of linear Gaussian models, showing that a wide range of statistical techniques for modeling static and time-series data can be viewed as variations of a single basic generative model. The authors unify factor analysis, principal component analysis (PCA), mixtures of Gaussian clusters, vector quantization, Kalman filter models, and hidden Markov models (HMMs) under a common framework. They introduce a new way to link discrete-state and continuous-state models using a simple nonlinearity and show how independent component analysis (ICA) can also be viewed as a variation of the same model. The paper further introduces a new model for static data, sensible principal component analysis (SPCA), along with the novel concept of spatially adaptive observation noise. It reviews the literature on global and local mixtures of these models and provides pseudocode for inference and learning.
The basic model is a discrete-time linear dynamical system with Gaussian noise. The state x evolves according to a linear process, and the output y is generated from the current state by a linear observation process; both the state evolution and the observation process are corrupted by additive Gaussian noise. The model is described by the equations

x_{t+1} = A x_t + w_t,    w_t ~ N(0, Q)
y_t     = C x_t + v_t,    v_t ~ N(0, R)

where A is the state transition matrix, C is the observation matrix, and w_t and v_t are Gaussian noise vectors. The model assumes that the state x lives in a space of lower dimension than the observation y, and that the state evolution and observation noises are temporally white and spatially Gaussian.

The paper discusses the probability computations for the model, showing that the likelihood of the data can be computed using the Gaussian properties of the model. It then presents the learning and estimation problems, focusing on inference (filtering and smoothing) and system identification. The EM algorithm is used to learn the model parameters, and the paper shows how it applies to models such as factor analysis, PCA, and HMMs.

The paper also discusses continuous-state and discrete-state linear Gaussian models, showing how each can be used to model static and dynamic data. It introduces sensible principal component analysis (SPCA) for static data and discusses the implications of using different noise covariance structures. The paper concludes by showing how independent component analysis (ICA) can be viewed as a nonlinear extension of factor analysis and how the EM algorithm can be used for learning across all of these models.
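To make the generative model above concrete, here is a minimal NumPy sketch that samples a trajectory from the state-space equations x_{t+1} = A x_t + w_t, y_t = C x_t + v_t. This is an illustration rather than the paper's own pseudocode; the function name and the example parameter values are chosen here for demonstration only.

```python
import numpy as np

def sample_lds(A, C, Q, R, x0, T, seed=None):
    """Draw one trajectory from the linear Gaussian state-space model:
       x_{t+1} = A x_t + w_t,  w_t ~ N(0, Q)
       y_t     = C x_t + v_t,  v_t ~ N(0, R)."""
    rng = np.random.default_rng(seed)
    k = A.shape[0]            # state dimension
    p = C.shape[0]            # observation dimension
    X = np.zeros((T, k))
    Y = np.zeros((T, p))
    x = x0
    for t in range(T):
        X[t] = x
        Y[t] = C @ x + rng.multivariate_normal(np.zeros(p), R)
        x = A @ x + rng.multivariate_normal(np.zeros(k), Q)
    return X, Y

# Example: a slowly rotating 2-D state observed noisily in 3-D,
# reflecting the assumption that the state has lower dimension than y.
A = np.array([[0.99, -0.10], [0.10, 0.99]])
C = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Q = 0.01 * np.eye(2)
R = 0.10 * np.eye(3)
X, Y = sample_lds(A, C, Q, R, x0=np.array([1.0, 0.0]), T=100, seed=0)
```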
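The filtering half of the inference problem described above is solved exactly by the Kalman filter, which propagates the Gaussian posterior p(x_t | y_1..y_t) forward in time and, as a byproduct, accumulates the data log-likelihood. Below is a standard forward-pass sketch under the same notation (mu0 and V0 are the prior mean and covariance of the initial state); the smoother would add a backward pass over these filtered quantities.

```python
import numpy as np

def kalman_filter(Y, A, C, Q, R, mu0, V0):
    """Forward (filtering) pass: compute p(x_t | y_1..y_t) for each t,
    plus the total Gaussian log-likelihood of the observations."""
    T, p = Y.shape
    k = A.shape[0]
    mus = np.zeros((T, k))     # filtered means
    Vs = np.zeros((T, k, k))   # filtered covariances
    loglik = 0.0
    mu_pred, V_pred = mu0, V0
    for t in range(T):
        # Measurement update: fold in y_t via the innovation.
        S = C @ V_pred @ C.T + R              # innovation covariance
        e = Y[t] - C @ mu_pred                # innovation
        K = V_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
        mus[t] = mu_pred + K @ e
        Vs[t] = V_pred - K @ C @ V_pred
        # Log-likelihood of y_t given the past (Gaussian predictive density).
        _, logdet = np.linalg.slogdet(S)
        loglik += -0.5 * (p * np.log(2 * np.pi) + logdet
                          + e @ np.linalg.solve(S, e))
        # Time update: propagate the posterior through the dynamics.
        mu_pred = A @ mus[t]
        V_pred = A @ Vs[t] @ A.T + Q
    return mus, Vs, loglik
```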
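For the static case (A = 0), EM learning reduces to the classic factor analysis updates, and constraining the observation noise to a single spherical level eps * I gives the SPCA model; PCA is recovered in the limit eps -> 0. The sketch below follows the standard EM updates for factor analysis rather than reproducing the paper's pseudocode verbatim, and the spca flag is our own shorthand for the spherical-noise constraint.

```python
import numpy as np

def factor_analysis_em(Y, k, n_iter=200, spca=False, seed=0):
    """EM for factor analysis: y = C x + v with x ~ N(0, I), v ~ N(0, R),
    R diagonal. With spca=True, R is tied to a spherical eps * I (SPCA)."""
    rng = np.random.default_rng(seed)
    n, p = Y.shape
    Y = Y - Y.mean(axis=0)            # the model assumes zero-mean data
    S = (Y.T @ Y) / n                 # sample covariance
    C = rng.standard_normal((p, k))
    R = np.diag(np.diag(S)).copy()
    for _ in range(n_iter):
        # E-step: the posterior over x given y is Gaussian with
        # beta = C' (C C' + R)^{-1}, E[x|y] = beta y, Cov[x|y] = I - beta C.
        beta = C.T @ np.linalg.inv(C @ C.T + R)
        Exx = np.eye(k) - beta @ C + beta @ S @ beta.T  # E[x x'] over the data
        # M-step: closed-form parameter updates.
        C = S @ beta.T @ np.linalg.inv(Exx)
        R_new = S - C @ beta @ S
        if spca:
            R = (np.trace(R_new) / p) * np.eye(p)  # one shared noise level
        else:
            R = np.diag(np.diag(R_new))            # per-dimension noise
    return C, R
```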