A Variational Bayesian Framework for Graphical Models

Hagai Attias
This paper introduces Variational Bayes (VB), a novel practical framework for Bayesian model averaging and selection in probabilistic graphical models. The approach analytically approximates the full posterior distributions over model parameters, structures, and latent variables. Unlike large-sample approximations, the resulting posteriors are generally non-Gaussian and require no Hessian computation. Predictive quantities are obtained analytically, and the resulting algorithm generalizes the standard Expectation-Maximization (EM) algorithm with guaranteed convergence. The VB framework applies to a wide range of models and penalizes model complexity through the KL divergence between the approximate and true posteriors. The paper demonstrates the effectiveness of VB through applications to mixture models and blind source separation, showing competitive performance compared to traditional methods.
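To make the idea concrete, the sketch below shows coordinate-ascent variational inference for a toy model: a univariate Gaussian with unknown mean and precision under a Normal-Gamma prior, with the factorized approximation q(mu, tau) = q(mu)q(tau). This is a standard textbook illustration of the VB/variational-EM idea, not a reimplementation of the paper's algorithms; all function and variable names here are illustrative.

```python
import numpy as np

def cavi_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Coordinate-ascent VB for x_i ~ N(mu, 1/tau) with prior
    mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0), and the
    factorized approximation q(mu, tau) = q(mu) q(tau)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    e_tau = a0 / b0              # initial guess for E[tau]
    a_n = a0 + 0.5 * (n + 1)     # shape of q(tau); fixed across iterations
    for _ in range(iters):
        # Update q(mu) = N(mu_n, 1/lam_n), holding q(tau) fixed.
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        e_mu = mu_n
        e_mu2 = mu_n ** 2 + 1.0 / lam_n   # E[mu^2] under q(mu)
        # Update q(tau) = Gamma(a_n, b_n), holding q(mu) fixed.
        b_n = b0 + 0.5 * (np.sum(x ** 2) - 2.0 * e_mu * np.sum(x)
                          + n * e_mu2
                          + lam0 * (e_mu2 - 2.0 * mu0 * e_mu + mu0 ** 2))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n
```

Each update is the expectation of the log joint under the other factor, so every iteration increases a lower bound on the log evidence, mirroring how VB generalizes the E- and M-steps of EM with guaranteed convergence. On simulated data the posterior mean mu_n settles near the sample mean and E[tau] = a_n/b_n near the true precision.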