ON THE RATE OF CONVERGENCE IN WASSERSTEIN DISTANCE OF THE EMPIRICAL MEASURE

7 Dec 2013 | NICOLAS FOURNIER AND ARNAUD GUILLIN
The paper by Fournier and Guillin studies the rate of convergence of the empirical measure $\mu_N = N^{-1} \sum_{k=1}^{N} \delta_{X_k}$ of an i.i.d. sample to the underlying distribution $\mu$ on $\mathbb{R}^d$ in the Wasserstein distance $\mathcal{W}_p$, for any $p > 0$ and $d \geq 1$. Writing $\mathcal{T}_p(\mu_N, \mu) = \mathcal{W}_p^p(\mu_N, \mu)$ for the associated transport cost, the main contributions are:

1. **Non-asymptotic $L^p$ bounds.** If $\mu$ has a finite moment $M_q(\mu)$ of order $q > p$, there exists a constant $C$ such that
$$\mathbb{E}\big(\mathcal{T}_p(\mu_N, \mu)\big) \leq C \, M_q^{p/q}(\mu) \left( N^{-1/2} + N^{-(q-p)/q} \right)$$
for $p > d/2$ and $q \neq 2p$, with analogous bounds in the remaining cases (a logarithmic correction at $p = d/2$, and the dimension-dependent rate $N^{-p/d}$ for $p < d/2$).

2. **Concentration inequalities.** There are explicit functions $a(N, x)$ and $b(N, x)$, expressed in terms of moment conditions on $\mu$, such that $\Pr\big(\mathcal{T}_p(\mu_N, \mu) \geq x\big) \leq a(N, x) + b(N, x)$ for all $N \geq 1$ and $x > 0$.

3. **Extensions and applications.** The results are extended beyond the i.i.d. setting to stationary $\rho$-mixing sequences, Markov chains, and interacting particle systems, and their implications are discussed for quantization, optimal matching, density estimation, clustering, and MCMC methods.

The proofs build on recent ideas of Dereich, Scheutzow, and Schottstedt and yield sharp, non-asymptotic estimates for the convergence of empirical measures in a range of statistical and probabilistic applications.
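As a quick numerical sanity check of the $p > d/2$ regime in item 1, the sketch below estimates $\mathbb{E}\,\mathcal{W}_1(\mu_N, \mu)$ in the simplest case $d = 1$, $p = 1$, approximating $\mu$ by one large reference sample. The standard-normal choice of $\mu$, the sample sizes, and the use of `scipy.stats.wasserstein_distance` are illustrative assumptions, not part of the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(42)

# A large i.i.d. sample standing in for the true law mu (standard normal).
ref = rng.normal(size=100_000)

def mean_w1(n, trials=30):
    """Average W_1(mu_N, mu) over independent empirical measures of size n."""
    return float(np.mean([
        wasserstein_distance(rng.normal(size=n), ref) for _ in range(trials)
    ]))

# With d = 1 and p = 1 we are in the p > d/2 regime, so the expected
# distance should shrink roughly like N^{-1/2}: quadrupling N should
# roughly halve it (up to the small bias from using a finite reference).
for n in (100, 400, 1600):
    print(n, round(mean_w1(n), 4))
```

Plotting these averages against $N$ on a log-log scale gives a slope near $-1/2$, consistent with the $N^{-1/2}$ term in the bound.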