General state space Markov chains and MCMC algorithms

Probability Surveys, Vol. 1 (2004), 20-71 | Gareth O. Roberts, Jeffrey S. Rosenthal
This paper surveys various results about Markov chains on general (non-countable) state spaces, focusing on Markov chain Monte Carlo (MCMC) algorithms. It begins with an introduction to MCMC, which provides the motivation for the theory. The paper presents sufficient conditions for geometric and uniform ergodicity, along with quantitative bounds on the rate of convergence to stationarity. These results are often proved using direct coupling constructions based on minorisation and drift conditions. Necessary and sufficient conditions for Central Limit Theorems (CLTs) are also discussed, sometimes proved via the Poisson Equation or direct regeneration constructions. The paper also presents optimal scaling and weak convergence results for Metropolis-Hastings algorithms. None of the results is new, though many of the proofs are. Open problems are also described.

MCMC algorithms are used to approximate sampling from complicated, high-dimensional probability distributions. Each algorithm simulates a Markov chain whose stationary distribution is the target distribution, and the central question is how quickly the chain converges to that distribution. The paper discusses the construction of MCMC algorithms, including the Metropolis-Hastings algorithm and the Gibbs sampler, and then addresses convergence rates, total variation distance, and asymptotic convergence.
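To make the Metropolis-Hastings construction concrete, here is a minimal sketch of a random-walk Metropolis sampler in Python. The target log_density (a standard normal, up to an additive constant), the proposal scale, and all names are illustrative assumptions for this sketch, not anything specified in the paper.

import math
import random

def log_density(x):
    # Illustrative target: standard normal, up to an additive constant.
    return -0.5 * x * x

def random_walk_metropolis(n_steps, x0=0.0, proposal_scale=1.0):
    # With symmetric Gaussian proposals, the Metropolis-Hastings
    # acceptance probability reduces to min(1, pi(y) / pi(x)).
    x = x0
    samples = []
    for _ in range(n_steps):
        y = x + random.gauss(0.0, proposal_scale)    # propose a candidate move
        log_alpha = log_density(y) - log_density(x)
        if random.random() < math.exp(min(0.0, log_alpha)):
            x = y                                    # accept; otherwise stay put
        samples.append(x)
    return samples

chain = random_walk_metropolis(10_000)
print(sum(chain) / len(chain))   # ergodic average; near 0 for this target

By construction the chain leaves the target distribution stationary; the survey's results concern how quickly averages like the one printed above converge to the corresponding expectation.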
The survey then examines uniform ergodicity and geometric ergodicity, which are stronger forms of convergence than simple ergodicity: geometric ergodicity requires convergence at a geometric rate, and uniform ergodicity requires that rate to hold uniformly over all starting points. The paper emphasizes the importance of aperiodicity and irreducibility in ensuring convergence and provides checkable conditions for these properties. It also discusses the implications of periodicity and the use of shift-coupling for quantitative bounds on convergence, and it closes by relating these results to MCMC algorithms and their applications in Bayesian statistics.
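For reference, the central quantities can be written in standard notation (this is the usual formulation in the literature, not text quoted from the paper). Total variation distance is

\[
\|P^n(x,\cdot) - \pi(\cdot)\|_{\mathrm{TV}} \;=\; \sup_{A}\,\bigl|P^n(x,A) - \pi(A)\bigr|,
\]

and a chain is geometrically ergodic if there exist \(\rho < 1\) and \(M(x) < \infty\) such that

\[
\|P^n(x,\cdot) - \pi(\cdot)\|_{\mathrm{TV}} \;\le\; M(x)\,\rho^{n} \quad \text{for all } n,
\]

with uniform ergodicity corresponding to a bound \(M(x) \le M < \infty\) independent of the starting point \(x\). The minorisation and drift conditions used in the coupling proofs take the form

\[
P(x,\cdot) \;\ge\; \epsilon\,\nu(\cdot) \quad \text{for all } x \in C,
\qquad
\mathbf{E}\bigl[V(X_1) \mid X_0 = x\bigr] \;\le\; \lambda V(x) + b\,\mathbf{1}_C(x),
\]

for a small set \(C\), a probability measure \(\nu\), and constants \(\epsilon > 0\), \(\lambda < 1\), \(b < \infty\).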
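Similarly, the Central Limit Theorems discussed in the survey concern statements of the following standard form (again, standard notation rather than a quotation from the paper):

\[
\frac{1}{\sqrt{n}} \sum_{i=1}^{n} \bigl(g(X_i) - \pi(g)\bigr) \;\Rightarrow\; N\bigl(0, \sigma_g^2\bigr),
\qquad
\sigma_g^2 \;=\; \mathrm{Var}_\pi\bigl(g(X_0)\bigr) + 2\sum_{k=1}^{\infty} \mathrm{Cov}_\pi\bigl(g(X_0),\, g(X_k)\bigr),
\]

where the variance and covariances are computed with the chain started in stationarity; conditions such as the minorisation and drift inequalities above are one route to verifying that \(\sigma_g^2\) is finite.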