Marginal Likelihood from the Gibbs Output

December 1995 | Siddhartha Chib
This paper presents a method for computing the marginal likelihood (the marginal density of the data) and Bayes factors from the output of a Gibbs sampler. The marginal likelihood is a key quantity in Bayesian model comparison, as it is the normalizing constant of the posterior density. Estimating it directly has traditionally been difficult because it requires integrating the likelihood function over the prior; the method described here avoids that computation. The approach exploits the identity that the marginal likelihood equals the product of the likelihood and the prior divided by the posterior density. Using the posterior draws produced by the Gibbs sampler, the method estimates the posterior density at a high-density point, which yields an accurate and stable estimate of the marginal likelihood. The method is particularly convenient in models where the full conditional densities have closed-form expressions, as with conjugate priors and in many standard applications of the Gibbs sampler.

The paper illustrates the method in the context of probit regression and finite mixture models. For probit regression, the method is applied to a dataset on prostate cancer nodal involvement, where the marginal likelihood is estimated for several competing models. For finite mixture models, it is applied to data on galaxy velocities and on U.S. gross national product growth rates, demonstrating its effectiveness in model comparison and selection. The approach is also shown to be useful for assessing the convergence of the Gibbs sampler and for computing the numerical standard error of the marginal likelihood estimate. Overall, the method provides a straightforward and computationally efficient way to estimate the marginal likelihood and Bayes factors, making it a valuable tool in Bayesian analysis.
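The identity underlying the method can be checked directly in a conjugate setting. The sketch below (illustrative only, not the paper's code) evaluates log m(y) = log f(y | θ*) + log π(θ*) − log π(θ* | y) at a high-density point θ* for a normal-mean model with known variance, where the posterior ordinate is available in closed form, and compares it with the exact marginal likelihood. In the paper's setting the posterior ordinate is instead estimated from the Gibbs output; all parameter values here are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

rng = np.random.default_rng(0)
sigma = 1.0            # known data standard deviation
mu0, tau0 = 0.0, 2.0   # prior: mu ~ N(mu0, tau0^2)
y = rng.normal(0.5, sigma, size=20)
n = len(y)

# Conjugate posterior for mu: N(mu_n, tau_n^2)
tau_n2 = 1.0 / (1.0 / tau0**2 + n / sigma**2)
mu_n = tau_n2 * (mu0 / tau0**2 + y.sum() / sigma**2)

theta_star = mu_n  # a high-density point (here, the posterior mean)

# Basic marginal likelihood identity, evaluated on the log scale
log_m = (norm.logpdf(y, theta_star, sigma).sum()             # log f(y | theta*)
         + norm.logpdf(theta_star, mu0, tau0)                # log pi(theta*)
         - norm.logpdf(theta_star, mu_n, np.sqrt(tau_n2)))   # log pi(theta* | y)

# Exact marginal: y ~ N(mu0 * 1, sigma^2 I + tau0^2 * 1 1^T)
cov = sigma**2 * np.eye(n) + tau0**2 * np.ones((n, n))
log_m_exact = multivariate_normal.logpdf(y, mean=np.full(n, mu0), cov=cov)

# The identity is exact, so the two agree up to floating-point error
```

Because the identity holds for every θ*, the estimate is exact here; in the general case, the accuracy of the Gibbs-based posterior ordinate estimate determines the accuracy of the marginal likelihood estimate.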
The results demonstrate that the method produces accurate and stable estimates, even in complex models with multiple components and latent variables.
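The key computational step, estimating the posterior ordinate at the chosen point, can be done by averaging the full conditional density over the Gibbs draws (a Rao-Blackwellized estimate). A minimal sketch, assuming a two-block Gibbs sampler for a normal model with unknown mean and variance and arbitrarily chosen priors, is:

```python
import numpy as np
from scipy.stats import norm, invgamma

rng = np.random.default_rng(1)
y = rng.normal(1.0, 1.5, size=50)
n = len(y)
mu0, tau0 = 0.0, 10.0   # prior: mu ~ N(mu0, tau0^2)
a, b = 2.0, 2.0         # prior: sigma2 ~ IG(a, b)

G, burn = 5000, 500
mu, sigma2 = y.mean(), y.var()
mu_draws, cond_means, cond_sds = [], [], []
for g in range(G):
    # mu | sigma2, y ~ N(m, v) -- full conditional in closed form
    v = 1.0 / (1.0 / tau0**2 + n / sigma2)
    m = v * (mu0 / tau0**2 + y.sum() / sigma2)
    mu = rng.normal(m, np.sqrt(v))
    # sigma2 | mu, y ~ IG(a + n/2, b + 0.5 * sum (y - mu)^2)
    sigma2 = invgamma.rvs(a + n / 2,
                          scale=b + 0.5 * ((y - mu) ** 2).sum(),
                          random_state=rng)
    if g >= burn:
        mu_draws.append(mu)
        cond_means.append(m)    # parameters of pi(mu | y, sigma2^(g))
        cond_sds.append(np.sqrt(v))

mu_star = np.mean(mu_draws)  # a high-density point
# Rao-Blackwellized estimate of the posterior ordinate:
# pi_hat(mu* | y) = (1/G) * sum_g pi(mu* | y, sigma2^(g))
post_ord = np.mean(norm.pdf(mu_star, cond_means, cond_sds))
```

The estimated ordinate `post_ord` then enters the identity in place of the exact posterior density, giving the marginal likelihood estimate from the Gibbs output.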