December 23, 2003 | Paul Marjoram*, John Molitor*, Vincent Plagnol†, and Simon Tavaré**
The article presents a Markov chain Monte Carlo (MCMC) method for generating observations from a posterior distribution without the use of likelihood functions, which are often difficult or computationally expensive to evaluate. The approach also lends itself to frequentist applications such as maximum-likelihood estimation, and it is illustrated with an example from population genetics. The authors compare the MCMC approach with rejection methods and approximate Bayesian computation (ABC) methods, highlighting the advantages and limitations of each. They demonstrate that the MCMC method can provide a good approximation to the true posterior distribution even when likelihoods are unavailable. The discussion emphasizes the importance of choosing appropriate summary statistics for improving the accuracy of the posterior estimates and identifies this choice as a topic for further research.
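To make the sampler's mechanics concrete, the sketch below implements a likelihood-free MCMC (ABC-MCMC) scheme of the general kind the article describes, but on a deliberately simple stand-in problem: inferring the mean of a Gaussian, with the sample mean as summary statistic, a Normal prior, a symmetric random-walk proposal, and an arbitrary tolerance epsilon. All of these modeling choices are illustrative assumptions rather than the paper's population-genetics example; the point is only the control flow, in which a proposed parameter is accepted using the prior and proposal densities alone, provided the simulated summary lands within epsilon of the observed one.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: observed data are Normal(mu, 1) draws and we infer mu.
mu_true = 2.0
observed = rng.normal(mu_true, 1.0, size=100)

def summary(data):
    # Summary statistic S(D): the sample mean (an illustrative choice).
    return data.mean()

def simulate(mu, size=100):
    # Simulate a data set from the model for a given parameter value.
    return rng.normal(mu, 1.0, size=size)

def log_prior(mu):
    # Normal(0, 10^2) prior on mu (an illustrative choice).
    return -0.5 * (mu / 10.0) ** 2

s_obs = summary(observed)
epsilon = 0.05        # tolerance for matching summary statistics
proposal_sd = 0.5     # symmetric random-walk proposal, so q cancels in the ratio
n_iter = 20000

mu_current = 0.0
samples = []
for _ in range(n_iter):
    mu_prop = mu_current + rng.normal(0.0, proposal_sd)
    # Simulate under the proposed parameter; no likelihood is ever evaluated.
    s_sim = summary(simulate(mu_prop))
    # Consider a move only if the simulated summary is within epsilon of the
    # observed one; the acceptance ratio then involves just the prior, since
    # the proposal is symmetric.
    if abs(s_sim - s_obs) <= epsilon:
        if np.log(rng.uniform()) < log_prior(mu_prop) - log_prior(mu_current):
            mu_current = mu_prop
    samples.append(mu_current)

posterior = np.array(samples[n_iter // 2:])  # crude burn-in removal
print(f"approximate posterior mean for mu: {posterior.mean():.3f}")

Shrinking epsilon or using a more informative summary statistic tightens the approximation to the true posterior at the cost of lower acceptance rates, which is the trade-off behind the article's emphasis on choosing summary statistics well.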