Received January 2003. Final revision January 2006 | Pierre Del Moral, Arnaud Doucet and Ajay Jasra
This paper introduces a methodology for sequentially sampling from a sequence of probability distributions defined on a common space, each distribution being known up to a normalizing constant. The approach is based on sequential Monte Carlo (SMC) methods, which approximate the target distributions by a cloud of weighted random samples that are propagated over time. This methodology allows for the development of simple algorithms to make parallel Markov chain Monte Carlo (MCMC) algorithms interact, performing global optimization and sequential Bayesian estimation, and computing ratios of normalizing constants. The paper focuses on the algorithmic aspects of SMC samplers, providing a generic sequential importance sampling (SIS) algorithm and discussing its limitations. It then proposes a solution using auxiliary variables and artificial backward Markov kernels to circumvent the calculation of the importance distribution. The resulting algorithm is shown to preserve complexity at $O(N)$ and provides asymptotically consistent estimates. The paper also discusses the selection of optimal and suboptimal backward kernels, and presents convergence results for the estimates of expectations and ratios of normalizing constants. Finally, it illustrates the methodology through an example of Bayesian analysis of finite mixture distributions, demonstrating the benefits of resampling in the SMC framework.
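To make the ingredients mentioned in the abstract concrete (weighted particles, incremental weights induced by a suboptimal backward kernel, resampling, MCMC move kernels, and normalizing-constant ratio estimation), here is a minimal self-contained sketch. It is not the authors' implementation: the tempered Gaussian target, the function names (`smc_sampler`, `log_target`), and all tuning values are illustrative assumptions. It uses the standard tempering incremental weight $\pi_n(x)/\pi_{n-1}(x)$, which corresponds to a suboptimal (time-reversal type) choice of backward kernel paired with a $\pi_n$-invariant MCMC move.

```python
import numpy as np


def log_target(x, beta):
    """Tempered log-density pi_beta(x) proportional to exp(-beta * x^2 / 2).

    A toy Gaussian path: pi_beta = N(0, 1/beta), so the normalizing
    constant Z(beta) = sqrt(2*pi/beta) is known and can be checked.
    """
    return -beta * x**2 / 2.0


def smc_sampler(n_particles=2000, betas=None, seed=0):
    """Toy SMC sampler along a tempered sequence of Gaussian targets.

    Illustrative sketch only: uses pi_n-invariant random-walk Metropolis
    moves and the tempering incremental weight pi_n(x)/pi_{n-1}(x)
    arising from a suboptimal (reversal-type) backward kernel.
    """
    if betas is None:
        betas = np.linspace(0.1, 1.0, 20)
    rng = np.random.default_rng(seed)
    # Initialise exactly from pi_0 = N(0, 1/beta_0), so weights start uniform.
    x = rng.normal(0.0, 1.0 / np.sqrt(betas[0]), n_particles)
    logw = np.zeros(n_particles)
    log_ratio = 0.0  # running estimate of log(Z_n / Z_0)
    for beta_prev, beta in zip(betas[:-1], betas[1:]):
        # Incremental weight, evaluated at the current particle positions.
        dlogw = log_target(x, beta) - log_target(x, beta_prev)
        # Update the normalizing-constant ratio estimate before reweighting.
        W = np.exp(logw - logw.max())
        W /= W.sum()
        log_ratio += np.log(np.sum(W * np.exp(dlogw)))
        logw += dlogw
        # Resample (multinomial) when the effective sample size drops below N/2.
        W = np.exp(logw - logw.max())
        W /= W.sum()
        if 1.0 / np.sum(W**2) < n_particles / 2:
            idx = rng.choice(n_particles, size=n_particles, p=W)
            x = x[idx]
            logw = np.zeros(n_particles)
        # Move each particle with a pi_n-invariant random-walk Metropolis step.
        prop = x + rng.normal(0.0, 0.5, n_particles)
        log_alpha = log_target(prop, beta) - log_target(x, beta)
        accept = np.log(rng.uniform(size=n_particles)) < log_alpha
        x = np.where(accept, prop, x)
    W = np.exp(logw - logw.max())
    W /= W.sum()
    return x, W, log_ratio
```

Because the path is Gaussian, the true ratio is known, $\log\{Z(\beta_T)/Z(\beta_0)\} = \tfrac{1}{2}\log(\beta_0/\beta_T)$, which for the default schedule is $\tfrac{1}{2}\log 0.1 \approx -1.15$; the particle estimate should land close to this, and the weighted particles approximate the final $N(0,1)$ target. Each step remains $O(N)$, as claimed in the abstract.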