Rényi Divergence and Kullback-Leibler Divergence
Tim van Erven, Peter Harremoës, Member, IEEE
24 April 2014
Rényi divergence is a generalization of the Kullback-Leibler divergence, much as Rényi entropy generalizes Shannon entropy. It depends on a parameter called its order, $\alpha$: for any order $\alpha \neq 1$ it is given by an explicit formula, and for $\alpha = 1$ it is defined by continuity and coincides with the Kullback-Leibler divergence. This paper reviews and extends the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, and limits of $\sigma$-algebras. It also treats the special case of order $0$, which relates to absolute continuity and contiguity, and the case of order $\infty$, which is connected to worst-case regret in coding. The paper generalizes the Pythagorean inequality to orders other than $1$ and extends the known equivalence between channel capacity and minimax redundancy to continuous input distributions. Rényi divergence is also related to other measures, such as the Bhattacharyya distance, the Hellinger distance, and the $\chi^2$-divergence. The treatment covers the definition of Rényi divergence on general (continuous) spaces, its behavior as a function of $\alpha$, including continuity in the order, its properties for fixed orders, and its applications in hypothesis testing, coding, and other areas of information theory.
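For concreteness, the defining formula that the abstract alludes to can be stated in the discrete case, for distributions $P = (p_1, \ldots, p_n)$ and $Q = (q_1, \ldots, q_n)$; this is the standard expression, while the paper itself works with the general measure-theoretic version:

$$
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \ln \sum_{i=1}^{n} p_i^{\alpha}\, q_i^{1-\alpha},
\qquad \alpha \in (0, 1) \cup (1, \infty),
$$

and for $\alpha = 1$, by taking the limit,

$$
D_1(P \,\|\, Q) = \lim_{\alpha \to 1} D_\alpha(P \,\|\, Q) = \sum_{i=1}^{n} p_i \ln \frac{p_i}{q_i},
$$

which is exactly the Kullback-Leibler divergence. The base of the logarithm only fixes the unit of information (nats for $\ln$).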
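The relations to the other divergence measures mentioned above can likewise be made explicit. As standard identities in the discrete case, with the convention $\operatorname{Hel}^2(P, Q) = \sum_i (\sqrt{p_i} - \sqrt{q_i})^2$ for the squared Hellinger distance (other normalizations differ by a factor of $2$):

$$
D_{1/2}(P \,\|\, Q) = -2 \ln \sum_i \sqrt{p_i q_i} = -2 \ln\Bigl(1 - \tfrac{1}{2}\operatorname{Hel}^2(P, Q)\Bigr),
$$

so that $D_{1/2}$ equals twice the Bhattacharyya distance $-\ln \sum_i \sqrt{p_i q_i}$, and

$$
D_2(P \,\|\, Q) = \ln\bigl(1 + \chi^2(P \,\|\, Q)\bigr),
\qquad \chi^2(P \,\|\, Q) = \sum_i \frac{(p_i - q_i)^2}{q_i}.
$$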