Sequential Monte Carlo samplers

2006 | Pierre Del Moral, Arnaud Doucet and Ajay Jasra
This paper proposes a methodology for sequentially sampling from a sequence of probability distributions defined on a common space, each known only up to a normalizing constant. The approach uses sequential Monte Carlo (SMC) methods to approximate each distribution in the sequence with a cloud of weighted random samples that is propagated over time. This framework yields algorithms for global optimization, sequential Bayesian estimation, and the computation of ratios of normalizing constants.

The paper discusses applications of SMC samplers in Bayesian inference, optimization, and rare-event probability estimation, and presents a generic SMC algorithm that samples from a fixed target distribution by incorporating MCMC kernels or approximate Gibbs steps. Compared with traditional MCMC methods, SMC samplers are argued to offer advantages in computational efficiency and in handling complex, high-dimensional distributions. The methodology is illustrated on the Bayesian analysis of finite mixture distributions and on a sequential Bayesian inference problem. Convergence results show that SMC samplers provide asymptotically consistent estimates of expectations and normalizing constants. The paper concludes by discussing connections between SMC samplers and related methods such as importance sampling, resampling, and simulated annealing.
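The reweight/resample/move structure described above can be sketched in a minimal example. This is an illustrative implementation, not the paper's exact algorithm: it assumes a geometric tempering path between a standard-normal initial distribution and a bimodal unnormalized target, resamples at every step, and uses a random-walk Metropolis kernel for the move. All target choices and tuning parameters (particle count, temperature ladder, step size) are arbitrary.

```python
import numpy as np

def log_prior(x):
    # Initial distribution pi_0: standard normal, available in closed form.
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_target(x):
    # Unnormalized final target pi_n: a two-component Gaussian mixture
    # with modes at +/-3 (illustrative choice).
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def log_tempered(x, beta):
    # Geometric bridge: pi_beta proportional to pi_0^(1-beta) * pi_n^beta.
    return (1.0 - beta) * log_prior(x) + beta * log_target(x)

def smc_sampler(n_particles=4000, n_temps=21, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_temps)
    x = rng.standard_normal(n_particles)  # exact draws from pi_0
    log_Z = 0.0                           # running estimate of log(Z_n / Z_0)
    for beta_prev, beta in zip(betas[:-1], betas[1:]):
        # Reweight: incremental importance weights for pi_{beta_prev} -> pi_beta.
        inc = log_tempered(x, beta) - log_tempered(x, beta_prev)
        m = inc.max()
        log_Z += m + np.log(np.mean(np.exp(inc - m)))
        # Resample proportionally to the incremental weights.
        w = np.exp(inc - m)
        x = rng.choice(x, size=n_particles, p=w / w.sum())
        # Move: one random-walk Metropolis step invariant for pi_beta.
        prop = x + step * rng.standard_normal(n_particles)
        accept = np.log(rng.random(n_particles)) < (
            log_tempered(prop, beta) - log_tempered(x, beta)
        )
        x = np.where(accept, prop, x)
    return x, log_Z

particles, log_Z = smc_sampler()
```

Because `pi_0` is normalized, `log_Z` estimates the log normalizing constant of the target (here `log(2 * sqrt(2 * pi))`, about 1.61), and the resampling step lets the particle cloud split across both modes as the temperature increases, which a single random-walk chain would struggle to do.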