Abstract
Information-theoretic divergences are widely used in information theory, statistics, and other application areas. To meet the requirement of metric properties, we introduce a class of new metrics based on the triangular discrimination; these metrics are bounded. Moreover, we obtain some sharp inequalities relating the triangular discrimination to other information-theoretic divergences. Their asymptotic approximation properties are also discussed.
Highlights
In many applications such as pattern recognition, machine learning, statistics, optimization, and other applied branches of mathematics, it is beneficial to use information-theoretic divergences rather than the squared Euclidean distance to estimate the (dis)similarity of two probability distributions or positive arrays [1,2,3,4,5,6,7,8,9].
In [12], Endres and Schindelin proved that the square root of twice the Jensen–Shannon divergence is a metric.
Triangular discrimination, introduced by Topsøe in [13], is a non-logarithmic measure and is simple to compute; both measures are restated after this list.
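For reference, the two measures named in the highlights, stated under the conventions standard in this literature (a sketch assuming discrete probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n), with 0/0 terms taken as 0):

\[
\Delta(P,Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^2}{p_i+q_i},
\qquad
JS(P,Q)=\frac{1}{2}\sum_{i=1}^{n}\left(p_i\ln\frac{2p_i}{p_i+q_i}+q_i\ln\frac{2q_i}{p_i+q_i}\right),
\]

so the Endres–Schindelin result [12] states that \(\sqrt{2\,JS(P,Q)}\) satisfies the metric axioms.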
Summary
In many applications such as pattern recognition, machine learning, statistics, optimization, and other applied branches of mathematics, it is beneficial to use information-theoretic divergences rather than the squared Euclidean distance to estimate the (dis)similarity of two probability distributions or positive arrays [1,2,3,4,5,6,7,8,9]. Among them, the Kullback–Leibler divergence (relative entropy), triangular discrimination, variation distance, Hellinger distance, Jensen–Shannon divergence, symmetric chi-square divergence, J-divergence, and other important measures often play a critical role. Most of these divergences satisfy neither the metric properties nor boundedness [10]. Triangular discrimination, introduced by Topsøe in [13], is a non-logarithmic measure and is simple to compute. The main result is a class of new metrics derived from the triangular discrimination. Some new relationships among the triangular discrimination, the Jensen–Shannon divergence, the square of the Hellinger distance, and the variation distance are also obtained.
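To make the measures in the summary concrete, the following is a minimal numerical sketch in Python/NumPy of the triangular discrimination, Jensen–Shannon divergence, squared Hellinger distance, and variation distance; the function names and the 0 log 0 := 0 and 0/0 := 0 conventions are illustrative choices, not notation taken from the paper.

import numpy as np

def triangular_discrimination(p, q):
    """Delta(P, Q) = sum_i (p_i - q_i)^2 / (p_i + q_i)  (Topsøe [13])."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = (p + q) > 0                      # convention: 0/0 terms contribute 0
    return np.sum((p[mask] - q[mask]) ** 2 / (p[mask] + q[mask]))

def jensen_shannon(p, q):
    """JS(P, Q) with natural logarithms; 0 log 0 is taken as 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):                           # Kullback–Leibler divergence KL(a || b)
        mask = a > 0                        # a_i > 0 implies m_i > 0, so b[mask] != 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def hellinger_sq(p, q):
    """Squared Hellinger distance h^2(P, Q) = (1/2) sum_i (sqrt(p_i) - sqrt(q_i))^2."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def variation(p, q):
    """Variation (L1) distance V(P, Q) = sum_i |p_i - q_i|."""
    return np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float)))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(triangular_discrimination(p, q))      # Delta(P, Q)
print(np.sqrt(2.0 * jensen_shannon(p, q)))  # the metric of Endres–Schindelin [12]
print(hellinger_sq(p, q), variation(p, q))

The square root of twice jensen_shannon reproduces the metric of [12]; the new metrics the paper derives from triangular_discrimination are its own contribution and are not reproduced in this sketch.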