Variational free energy and the Laplace approximation

2006 | Karl Friston, Jérémie Mattout, Nelson Trujillo-Barreto, John Ashburner, and Will Penny
This paper derives the variational free energy under the Laplace approximation, with a focus on accounting for the additional model complexity introduced by adding model parameters. This matters when the free energy is used as an approximation to the log-evidence in Bayesian model selection and averaging. By placing restricted maximum likelihood (ReML) within the framework of variational learning and expectation maximisation (EM), the paper shows how the ReML objective function can be adjusted to approximate the log-evidence for a particular model. This enables ReML to be used for model selection, specifically to compare models with different covariance components. This is useful in hierarchical models because it allows a principled choice of priors and supports automatic relevance determination (ARD).

The paper also shows that EM is formally identical to a full variational treatment when the precisions are linear in the hyperparameters, and it briefly considers dynamic models and how they inform the regularisation of free-energy ascent schemes such as EM and ReML. Deriving the ReML objective function from basic variational principles reveals the simple relationships among variational Bayes, EM, and ReML. The paper further discusses the use of ReML in hierarchical models to estimate covariance components, and concludes that ReML can serve as a model-selection device because redundant parameters and hyperparameters are switched off automatically. Finally, it covers neuroimaging applications, including source reconstruction in EEG and MEG.
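As a concrete illustration of the free energy serving as a log-evidence approximation, here is a minimal sketch (a toy example, not code from the paper; the model, variable names, and fixed noise/prior variances are all assumptions). For a linear-Gaussian model the Laplace approximation is exact, so the free energy can be checked against the exact log-evidence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian model y = X @ theta + noise, Gaussian prior on theta.
n, p = 50, 3
X = rng.standard_normal((n, p))
theta_true = np.array([1.0, -0.5, 0.25])
sigma2 = 0.5   # noise variance (assumed known here)
tau2 = 2.0     # prior variance on each parameter (assumed known here)
y = X @ theta_true + np.sqrt(sigma2) * rng.standard_normal(n)

# Gaussian posterior over theta (exact, since the model is linear-Gaussian).
prec_post = X.T @ X / sigma2 + np.eye(p) / tau2
cov_post = np.linalg.inv(prec_post)
mu_post = cov_post @ (X.T @ y) / sigma2

def log_joint(theta):
    # log p(y | theta) + log p(theta)
    resid = y - X @ theta
    ll = -0.5 * (n * np.log(2 * np.pi * sigma2) + resid @ resid / sigma2)
    lp = -0.5 * (p * np.log(2 * np.pi * tau2) + theta @ theta / tau2)
    return ll + lp

# Laplace free energy: log joint at the posterior mode plus the Gaussian
# volume term (p/2) log 2*pi + (1/2) log |posterior covariance|.
_, logdet = np.linalg.slogdet(cov_post)
F = log_joint(mu_post) + 0.5 * (p * np.log(2 * np.pi) + logdet)

# Exact log-evidence: y ~ N(0, sigma2 * I + tau2 * X X^T).
C = sigma2 * np.eye(n) + tau2 * X @ X.T
_, logdet_C = np.linalg.slogdet(C)
log_evidence = -0.5 * (n * np.log(2 * np.pi) + logdet_C
                       + y @ np.linalg.solve(C, y))

print(F, log_evidence)  # equal for a linear-Gaussian model
```

For nonlinear or non-Gaussian models the two quantities would differ, and the free energy becomes a lower bound whose complexity term penalises extra parameters, which is the effect the paper makes explicit.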
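The ARD mechanism, in which redundant parameters are switched off by optimising hyperparameters, can be sketched with a simple EM-style fixed-point update of per-parameter prior variances. This is a generic toy illustration under an assumed known noise variance, not the paper's ReML scheme; all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Design with one redundant predictor: the third column carries no signal.
n = 100
X = rng.standard_normal((n, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.standard_normal(n)
sigma2 = 0.01  # noise variance, assumed known for simplicity

# ARD: one prior variance tau2[j] per parameter, updated by an EM-style
# fixed point; prior variances of redundant parameters shrink toward zero.
tau2 = np.ones(3)
for _ in range(200):
    prec_post = X.T @ X / sigma2 + np.diag(1.0 / tau2)
    cov_post = np.linalg.inv(prec_post)
    mu_post = cov_post @ X.T @ y / sigma2
    tau2 = mu_post**2 + np.diag(cov_post)  # match posterior second moments

print(tau2)  # tau2[2] shrinks toward zero, pruning the redundant parameter
```

The surviving variances track the squared true coefficients, while the redundant parameter's prior variance collapses, which is the "switching off" behaviour the summary attributes to ReML-based model selection.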