Abstract

The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler divergence to the average mixture distribution. However, the Jensen–Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen–Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen–Shannon divergence between probability densities of the same exponential family, and (ii) the geometric JS-symmetrization of the reverse Kullback–Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen–Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen–Shannon divergences are touched upon.
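
As a rough illustration (not taken from the paper itself), the following minimal Python sketch computes a geometric Jensen–Shannon divergence between two univariate Gaussians, assuming the skewed definition (1−α)·KL(p : G_α) + α·KL(q : G_α) where G_α is the normalized weighted geometric mean of the two densities; for Gaussians (an exponential family) this geometric mixture is again a Gaussian obtained by averaging natural parameters, so only the closed-form Gaussian KL is needed. Function names and the exact normalization are illustrative assumptions.

```python
import numpy as np

def kl_gaussian(mu1, var1, mu2, var2):
    """Closed-form KL(N(mu1, var1) : N(mu2, var2)) in nats."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def geometric_jsd_gaussian(mu1, var1, mu2, var2, alpha=0.5):
    """Sketch of a skewed geometric Jensen-Shannon divergence between two
    univariate Gaussians: the 'mixture' is the normalized weighted geometric
    mean of the densities, which for Gaussians is again a Gaussian whose
    natural parameters (precision, precision*mean) are the weighted averages."""
    tau1, tau2 = 1.0 / var1, 1.0 / var2
    tau_g = (1.0 - alpha) * tau1 + alpha * tau2              # mixture precision
    mu_g = ((1.0 - alpha) * tau1 * mu1 + alpha * tau2 * mu2) / tau_g
    var_g = 1.0 / tau_g
    return ((1.0 - alpha) * kl_gaussian(mu1, var1, mu_g, var_g)
            + alpha * kl_gaussian(mu2, var2, mu_g, var_g))

# Example: two Gaussians with different means and variances.
print(geometric_jsd_gaussian(0.0, 1.0, 2.0, 0.5))
```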

Highlights

  • As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen–Shannon divergence between scale Cauchy distributions (a numerical sketch follows this list)

  • The paper is organized as follows: Section 2 reports the special case of mixture families in information geometry [18], for which the Jensen–Shannon divergence can be expressed as a Bregman divergence (Theorem 1), and highlights the lack of a closed-form formula when considering exponential families

  • Our motivation for introducing these novel families of M-Jensen–Shannon divergences is to obtain closed-form formulas when the probability densities belong to some given parametric family P_Θ
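
For the harmonic case referenced above, here is a minimal numerical sketch (not the paper's closed-form formula), assuming the M-Jensen–Shannon construction with the weighted harmonic mean of the densities as the normalized mixture; it simply checks the quantity by quadrature for two scale Cauchy densities centered at the origin.

```python
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, gamma):
    """Scale Cauchy density with scale gamma, centered at 0."""
    return gamma / (np.pi * (x * x + gamma * gamma))

def harmonic_jsd_cauchy(gamma1, gamma2, alpha=0.5):
    """Numerical sketch of a harmonic Jensen-Shannon divergence between two
    scale Cauchy densities: the abstract-mean 'mixture' is taken to be the
    normalized weighted harmonic mean of the two densities (an assumption)."""
    p = lambda x: cauchy_pdf(x, gamma1)
    q = lambda x: cauchy_pdf(x, gamma2)
    # Unnormalized weighted harmonic mean of the two densities.
    h = lambda x: 1.0 / ((1.0 - alpha) / p(x) + alpha / q(x))
    z, _ = quad(h, -np.inf, np.inf)          # normalizer of the harmonic mixture
    kl_p, _ = quad(lambda x: p(x) * np.log(p(x) / (h(x) / z)), -np.inf, np.inf)
    kl_q, _ = quad(lambda x: q(x) * np.log(q(x) / (h(x) / z)), -np.inf, np.inf)
    return (1.0 - alpha) * kl_p + alpha * kl_q

print(harmonic_jsd_cauchy(1.0, 3.0))
```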

Summary

Kullback–Leibler Divergence and Its Symmetrizations

Let (X, A) be a measurable space [1], where X denotes the sample space and A the σ-algebra of measurable events. The symmetrization of the Kullback–Leibler divergence (KLD) may be obtained using the harmonic mean instead of the arithmetic mean, yielding the resistor average distance [5] R(p; q), defined as for resistors in parallel by 1/R(p; q) = 1/KL(p : q) + 1/KL(q : p). Another famous symmetrization of the KLD is the Jensen–Shannon divergence [6] (JSD), defined by JS(p; q) = (1/2) KL(p : (p+q)/2) + (1/2) KL(q : (p+q)/2). This distance can be interpreted as the total divergence to the average distribution (see Equation (10)).
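
For concreteness, a minimal Python sketch of the two symmetrizations mentioned above, assuming discrete distributions with full support on a common finite alphabet; the resistor-average formula follows the parallel-resistor combination 1/R = 1/KL(p:q) + 1/KL(q:p) and may differ from the paper's exact normalization.

```python
import numpy as np

def kl(p, q):
    """Discrete Kullback-Leibler divergence KL(p : q), natural logarithm."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """JSD: total KL divergence to the arithmetic average distribution."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def resistor_average(p, q):
    """Resistor average distance: combines the two sided KL divergences
    like resistors in parallel, 1/R = 1/KL(p:q) + 1/KL(q:p)."""
    a, b = kl(p, q), kl(q, p)
    return a * b / (a + b)

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]
print(jensen_shannon(p, q), resistor_average(p, q))
```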

Statistical Distances and Parameter Divergences
J-Symmetrization and JS-Symmetrization of Distances
Contributions and Paper Outline
Jensen–Shannon Divergence in Mixture and Exponential Families
Generalized Jensen–Shannon Divergences
Some Closed-Form Formula for the M-Jensen–Shannon Divergences
The Geometric G-Jensen–Shannon Divergence
Case Study
Applications to k-Means Clustering
The M-Jensen–Shannon Matrix Distances
Conclusions and Perspectives