This paper introduces a new class of information-theoretic divergence measures based on the Shannon entropy. Unlike the well-known Kullback divergences, the new measures do not require the probability distributions involved to be absolutely continuous with respect to one another. The paper establishes a close relationship between these measures, the variational distance, and the probability of misclassification error, providing bounds that are useful in many applications of information theory. The new measures are shown to be nonnegative, finite, semibounded, and bounded.
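A representative measure of this type is the Jensen-Shannon divergence of two distributions $p$ and $q$, which can be written (with $H$ denoting the Shannon entropy, $H(p) = -\sum_x p(x)\log p(x)$; this notation is assumed here rather than fixed by the summary) as

$$\mathrm{JS}(p, q) = H\!\left(\frac{p+q}{2}\right) - \frac{1}{2}H(p) - \frac{1}{2}H(q),$$

that is, the entropy of the equal-weight mixture minus the average of the individual entropies. Because the mixture dominates both $p$ and $q$, no absolute-continuity condition is needed for the measure to be well defined.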
The paper also proposes a generalization of the Jensen-Shannon divergence that assigns a different weight to each probability distribution. This extension is particularly useful in decision-making problems, where the weights can be taken to be the prior probabilities of the classes. The generalized Jensen-Shannon divergence is shown to provide both lower and upper bounds for the Bayes probability of misclassification error.
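A sketch of this weighted form for $n$ distributions $p_1,\dots,p_n$ with weights $\pi_1,\dots,\pi_n$ (assumed nonnegative and summing to one; in a decision problem they can be read as class prior probabilities):

$$\mathrm{JS}_{\pi}(p_1,\dots,p_n) = H\!\left(\sum_{i=1}^{n}\pi_i\,p_i\right) - \sum_{i=1}^{n}\pi_i\,H(p_i),$$

the entropy of the $\pi$-weighted mixture minus the $\pi$-weighted average of the individual entropies.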
Additionally, the paper compares the new measures with the Kullback $I$ and $J$ divergences: they share the desirable properties of nonnegativity, finiteness, semiboundedness, and boundedness, but have the added advantage of remaining well defined for distributions that are not absolutely continuous with respect to one another, as the sketch below illustrates.
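A minimal numerical sketch of this advantage, assuming discrete distributions and natural logarithms (the helper functions and example vectors are illustrative, not taken from the paper): when the supports are disjoint, the Kullback divergence is infinite, while the Jensen-Shannon divergence stays finite and bounded by $\log 2$.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats; zero-probability outcomes contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_divergence(p, q):
    """Kullback divergence of p from q; infinite when p puts mass where q has none."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.any((p > 0) & (q == 0)):
        return np.inf
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: entropy of the midpoint mixture minus the mean entropy."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# Two distributions with disjoint supports: KL blows up, JS stays bounded.
p = [0.5, 0.5, 0.0, 0.0]
q = [0.0, 0.0, 0.5, 0.5]
print(kl_divergence(p, q))  # inf
print(js_divergence(p, q))  # ~0.6931, i.e. log 2, the maximum possible value
```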
The paper concludes by highlighting the theoretical foundation of these divergence measures and their potential applications, emphasizing their relevance to fields such as signal processing, pattern recognition, and taxonomy in biology and genetics.