April 20, 1998 | Adrian E. Raftery, David Madigan, Jennifer A. Hoeting
This paper presents Bayesian Model Averaging (BMA) for linear regression models, which accounts for model uncertainty by averaging over all candidate models when making inferences about quantities of interest. Conditioning on a single selected model ignores model uncertainty and therefore understates the true uncertainty. BMA provides a Bayesian solution: averaging over models yields better predictive performance than any single model. Two practical approaches are introduced: "Occam's Window," which averages over a small set of well-supported models, and a Markov chain Monte Carlo (MCMC) approach that directly approximates the exact all-models average. Both methods improve predictive performance compared to single selected models.
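The model-averaged prediction described above is a posterior-probability-weighted mean of per-model predictions. A minimal sketch of this idea, using the common BIC approximation to posterior model probabilities under a uniform model prior (the function names and the BIC shortcut are illustrative, not the paper's exact computation):

```python
import numpy as np

def bic_weights(bics):
    # Approximate posterior model probabilities from BIC values:
    #   p(M_k | data) ∝ exp(-BIC_k / 2), assuming a uniform prior over models.
    # Subtracting the minimum BIC first keeps the exponentials numerically stable.
    bics = np.asarray(bics, dtype=float)
    w = np.exp(-(bics - bics.min()) / 2)
    return w / w.sum()

def bma_prediction(per_model_preds, bics):
    # BMA point prediction: weighted average of each model's prediction
    # for the same quantity of interest.
    return bic_weights(bics) @ np.asarray(per_model_preds, dtype=float)
```

For example, two models with equal BIC receive equal weight, so their predictions are simply averaged.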
When there are many predictors and none is related to the response, standard variable selection methods often select models with high R² and apparently significant F values; Occam's Window, by contrast, typically identifies the null model as best, resolving the problem of finding "significant" models in the absence of any signal. Software implementing these methods is available.
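The core of the Occam's Window idea is to discard models whose posterior probability falls too far below that of the best model. A minimal sketch of this screening rule (the cutoff C = 20 and the function name are illustrative; the paper's full procedure also discards complex models that receive less support than their simpler submodels):

```python
import numpy as np

def occams_window(log_post_probs, C=20.0):
    # Keep only models whose posterior probability is within a factor C
    # of the best model's, i.e. log p(M_k) >= log p(M_best) - log C.
    log_post_probs = np.asarray(log_post_probs, dtype=float)
    best = log_post_probs.max()
    return np.flatnonzero(log_post_probs >= best - np.log(C))
```

Averaging is then carried out over only the retained models, which is what makes the approach computationally cheap relative to summing over all 2^p models.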
The paper discusses the Bayesian framework for linear regression, including prior distributions and model averaging approaches. It describes the use of a normal-gamma conjugate prior and the computation of marginal likelihoods. The paper also presents an example using crime data, where model averaging provides more accurate predictions than single models. It compares the performance of Occam's Window and standard variable selection methods, showing that model averaging improves predictive coverage and calibration.
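Under a normal-gamma conjugate prior, the marginal likelihood of the data given a model has a closed form: the response vector is marginally multivariate Student-t. A minimal sketch under one standard parameterization (β | σ² ~ N(μ, σ²V), σ² ~ Inverse-Gamma(a, b)); the specific hyperparameter choices here are illustrative and not the paper's recommended defaults:

```python
import numpy as np
from scipy.stats import multivariate_t

def log_marginal_likelihood(y, X, mu, V, a, b):
    # With beta | sigma^2 ~ N(mu, sigma^2 V) and sigma^2 ~ Inv-Gamma(a, b),
    # integrating out beta and sigma^2 gives the marginal distribution
    #   y ~ t_{2a}( X mu, (b/a) (I + X V X^T) ),
    # a multivariate Student-t with 2a degrees of freedom.
    n = len(y)
    scale = (b / a) * (np.eye(n) + X @ V @ X.T)
    return multivariate_t(loc=X @ mu, shape=scale, df=2 * a).logpdf(y)
```

Comparing these marginal likelihoods across models (times prior model probabilities) yields the posterior model probabilities that BMA uses as weights.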
Simulated examples demonstrate that BMA can improve predictive performance when there is model uncertainty, but not when there is little uncertainty. The paper also discusses the identification of the null model in the presence of no signal, showing that Occam's Window correctly identifies the null model in such cases. The paper concludes that BMA is a valuable approach for accounting for model uncertainty in linear regression, with different methods suitable for different applications.