A Variational Bayesian Framework for Graphical Models


Hagai Attias
This paper presents a novel practical framework for Bayesian model averaging and selection in probabilistic graphical models. The approach approximates full posterior distributions over model parameters, structures, and latent variables in an analytical manner. These posteriors emerge from a free-form optimization procedure that naturally incorporates conjugate priors. Unlike large-sample approximations, the posteriors are generally non-Gaussian and no Hessian needs to be computed. Predictive quantities are obtained analytically, and the resulting algorithm generalizes the standard Expectation Maximization (EM) algorithm with guaranteed convergence. The approach is demonstrated on a wide range of models, including mixture models and source separation.

The paper introduces a Variational Bayesian (VB) framework for Bayesian computations in graphical models, combining variational ideas from intractable latent-variable models with Bayesian inference. The framework yields analytical posterior distributions over hidden variables, parameters, and structures. These posteriors are obtained through a free-form optimization procedure that naturally incorporates conjugate priors and produces posteriors in standard forms, not just Gaussian ones. The resulting algorithm is closely related to EM and is guaranteed to converge; no Hessian is required, and predictive quantities can be computed analytically. Model selection is performed using the posterior over structure, with BIC/MDL criteria emerging as limiting cases. The framework is developed for directed acyclic graphs (DAGs), i.e. Bayesian networks, and the paper shows how the marginal likelihood and the posterior over parameters are approximated variationally. The VB objective function penalizes complex models and, in the large-sample limit, reduces to BIC/MDL.
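The sketch below shows the standard form of the variational Bayesian objective and updates that this description refers to; the notation is illustrative rather than copied from the paper. Writing $y$ for observed data, $x$ for hidden variables, $\theta$ for parameters, and $m$ for a candidate structure, VB maximizes a lower bound on the log marginal likelihood over a factorized approximate posterior $q(x,\theta) = q(x)\,q(\theta)$:

$$
\mathcal{F}_m[q] \;=\; \int q(x)\,q(\theta)\,\log\frac{p(y, x \mid \theta, m)\,p(\theta \mid m)}{q(x)\,q(\theta)}\,dx\,d\theta \;\le\; \log p(y \mid m).
$$

Free-form optimization of $\mathcal{F}_m$ yields the coupled updates

$$
q(x) \propto \exp\!\big\{\mathbb{E}_{q(\theta)}\big[\log p(y, x \mid \theta, m)\big]\big\},
\qquad
q(\theta) \propto p(\theta \mid m)\,\exp\!\big\{\mathbb{E}_{q(x)}\big[\log p(y, x \mid \theta, m)\big]\big\},
$$

which generalize the E and M steps of EM (EM is recovered when $q(\theta)$ is restricted to a point mass). With conjugate priors, $q(\theta)$ keeps the prior's standard functional form, which is why no Gaussian or Hessian-based approximation is needed. In the large-sample limit, $\mathcal{F}_m \approx \log p(y \mid \hat\theta, m) - \tfrac{d_m}{2}\log N$, with $d_m$ the number of free parameters and $N$ the number of data points, recovering the BIC/MDL penalty.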
The framework is shown to be effective for mixture models, where it is applied to the Boston housing dataset and the Buffalo post office dataset, achieving competitive results compared with other methods. The paper also applies VB to intractable models such as blind source separation (BSS), where it is used to estimate the number of sources and avoid overfitting; the VB-BSS algorithm, applied to speech and music signals, achieves accurate source reconstruction. More broadly, the VB framework is applicable to a wide range of graphical models and can be integrated with the junction tree algorithm for general inference. Its ability to handle non-Gaussian posteriors and its computational efficiency make it a powerful tool for Bayesian inference in complex models. Ongoing research includes applications to multinomial models and learning the structure of complex dynamic probabilistic networks.
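As a concrete illustration of the mixture-model behaviour described above, the short sketch below (not the paper's own code) uses scikit-learn's BayesianGaussianMixture: the variational posterior over the mixing weights drives redundant components toward zero weight, so the model effectively selects its own complexity instead of overfitting. The synthetic data and all parameter settings are illustrative assumptions.

```python
# Hedged illustration: variational Bayesian fitting of a Gaussian mixture,
# showing how VB prunes surplus components rather than overfitting them.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 2-D Gaussian clusters (an assumption for the demo).
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(300, 2)),
    rng.normal(loc=[5.0, 0.0], scale=0.5, size=(300, 2)),
    rng.normal(loc=[2.5, 4.0], scale=0.5, size=(300, 2)),
])

# Deliberately over-specify the number of components; the variational
# posterior over mixing weights pushes unneeded components toward zero.
vb_gmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_distribution",
    max_iter=500,
    random_state=0,
).fit(X)

effective = int(np.sum(vb_gmm.weights_ > 1e-2))
print("posterior mixing weights:", np.round(vb_gmm.weights_, 3))
print("effective number of components:", effective)   # typically 3 for this data
print("final variational lower bound per sample:", vb_gmm.lower_bound_)
```

The same idea underlies the paper's model-selection results: rather than comparing many separately trained models, the variational posterior itself suppresses structure the data do not support.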
Understanding A Variational Bayesian Framework for Graphical Models