THE TIGHT CONSTANT IN THE DVORETZKY-KIEFER-WOLFOWITZ INEQUALITY

1990 | P. MASSART
The paper by P. Massart establishes a tight constant in the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality. The DKW inequality bounds the probability that the empirical distribution function $\hat{F}_n$ deviates from the true distribution function $F$ by more than a given threshold $\lambda$. The original inequality involved an unspecified multiplicative constant $C$; Massart shows that $C$ can be taken to be 1 under the condition $\exp(-2\lambda^2) \leq \frac{1}{2}$, which improves upon previous bounds and confirms a conjecture of Birnbaum and McCarty. The paper also establishes that the two-sided DKW inequality holds with constant 2 without any restriction on $\lambda$. Asymptotic results and numerical computations show that these constants cannot be improved further.

The main result, Theorem 1, states that for any integer $n$ and any $\lambda$ satisfying $\exp(-2\lambda^2) \leq \frac{1}{2}$ (equivalently, $\lambda \geq \sqrt{\log(2)/2}$), the one-sided deviation probability $P(\sqrt{n}\, D_n^{-} > \lambda)$ is bounded by $\exp(-2\lambda^2)$. The result is derived from a detailed analysis of the empirical process and the properties of the Brownian bridge. The proof of Theorem 1 combines several technical lemmas and inequalities: it compares the laws of certain stopping times of the empirical process and of the Brownian bridge, and it uses exponential bounds for binomial tails related to the classical inequalities of Hoeffding and Bernstein.

The paper concludes with a discussion of the implications of the results, including the tightness of the constants in the DKW inequality and the validity of the results for both one-sided and two-sided tests. The results are important for statistical testing and the analysis of empirical distribution functions.
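The two-sided bound with Massart's constant can be checked numerically. The sketch below (not from the paper; sample size, threshold, and trial count are arbitrary choices) simulates uniform samples, computes the Kolmogorov-Smirnov statistic $D_n$, and compares the empirical exceedance frequency of $\sqrt{n}\,D_n$ against the bound $2\exp(-2\lambda^2)$. Since $D_n$ is distribution-free for continuous $F$, simulating $U(0,1)$ data suffices.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100          # sample size
lam = 0.7        # normalized threshold; exp(-2*lam**2) < 1/2 here
trials = 20000   # Monte Carlo repetitions

exceed = 0
for _ in range(trials):
    x = np.sort(rng.uniform(size=n))
    # For sorted x_(1..n) and F(x) = x on [0, 1]:
    # D_n^+ = max_i (i/n - x_(i)),  D_n^- = max_i (x_(i) - (i-1)/n)
    i = np.arange(1, n + 1)
    d_plus = np.max(i / n - x)
    d_minus = np.max(x - (i - 1) / n)
    d_n = max(d_plus, d_minus)
    if np.sqrt(n) * d_n > lam:
        exceed += 1

empirical = exceed / trials
bound = 2 * np.exp(-2 * lam**2)  # two-sided DKW with Massart's constant
print(f"P(sqrt(n) D_n > {lam}) ~ {empirical:.4f}  <=  bound {bound:.4f}")
```

For $\lambda = 0.7$ the bound is $2\exp(-0.98) \approx 0.75$, and the observed frequency should come in slightly below it, consistent with the claim that the constant 2 is tight but not loose.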