Abstract

The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence that does not require the probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar α-Jensen–Bregman divergences and derive from it the vector-skew α-Jensen–Shannon divergences. We prove that the vector-skew α-Jensen–Shannon divergences are f-divergences and study the properties of these novel divergences. Finally, we report an iterative algorithm to numerically compute the Jensen–Shannon-type centroids for a set of probability densities belonging to a mixture family: this includes the case of the Jensen–Shannon centroid of a set of categorical distributions or normalized histograms.
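The boundedness and support-robustness mentioned in the abstract can be illustrated with a minimal sketch (the function names `kl` and `js` are our own, not from the paper):

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence KL(p : q); the sum skips bins where p_i = 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen–Shannon divergence: average KL to the mid-mixture m = (p + q)/2;
    # finite and bounded above by log 2 even when supports differ
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# p and q have different supports, so KL(p : q) would be infinite, yet JS stays finite
p = [0.4, 0.6, 0.0]
q = [0.0, 0.5, 0.5]
print(js(p, q))  # a value in [0, log 2]
```

Because the mid-mixture m is nonzero wherever p or q is, neither KL term diverges, which is precisely why the Jensen–Shannon divergence applies to densities with mismatched supports.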

Highlights

  • Let (X, F, μ) be a measure space [1], where X denotes the sample space, F the σ-algebra of measurable events, and μ a positive measure; for example, the measure space defined by the Lebesgue measure μ_L with the Borel σ-algebra B(R^d) for X = R^d, or the measure space defined by the counting measure μ_c with the power set σ-algebra 2^X on a finite alphabet X.

  • We report an iterative algorithm to numerically compute the Jensen–Shannon-type centroids for a set of probability densities belonging to a mixture family: this includes the case of the Jensen–Shannon centroid of a set of categorical distributions or normalized histograms.

  • We prove that weighted vector-skew Jensen–Shannon divergences are f-divergences (Theorem 1), and show how to build families of symmetric Jensen–Shannon-type divergences which can be controlled by a vector of parameters in Section 2.3, generalizing the work of [20] from scalar skewing to vector skewing.
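Following the scalar-to-vector skewing described above, a vector-skew Jensen–Shannon divergence can be sketched as a weighted sum of KL terms between skewed mixtures (pq)_α = (1 − α)p + αq and the mixture at the mean skew ᾱ = Σ w_i α_i. This is our reading of the construction, not code from the paper:

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence KL(p : q); skips bins where p_i = 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mix(p, q, a):
    # skewed mixture (pq)_a = (1 - a) p + a q
    return [(1 - a) * pi + a * qi for pi, qi in zip(p, q)]

def vskew_js(p, q, alphas, weights):
    # vector-skew JS: sum_i w_i KL((pq)_{a_i} : (pq)_{abar}), abar = sum_i w_i a_i
    abar = sum(w * a for w, a in zip(weights, alphas))
    m = mix(p, q, abar)
    return sum(w * kl(mix(p, q, a), m) for w, a in zip(weights, alphas))
```

With the skew vector (0, 1) and equal weights (1/2, 1/2), ᾱ = 1/2 and the expression reduces to the ordinary Jensen–Shannon divergence, consistent with the claim that the family generalizes scalar skewing.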


Summary

Introduction

The asymmetric α-skew Jensen–Shannon divergence JS_a^α(p : q) can be defined for a scalar parameter α ∈ (0, 1) by considering the weighted mixture (pq)_α = (1 − α)p + αq. This yields a generalization of the symmetric skew α-Jensen–Shannon divergences to a vector-skew parameter. This extension retains the key properties of being upper-bounded and of applying to densities with potentially different supports. We prove that weighted vector-skew Jensen–Shannon divergences are f-divergences (Theorem 1), and show how to build families of symmetric Jensen–Shannon-type divergences which can be controlled by a vector of parameters, generalizing the work of [20] from scalar skewing to vector skewing. This may prove useful in applications by providing additional tuning parameters (which can be set, for example, by cross-validation techniques). The experimental results graphically compare the Jeffreys centroid with the Jensen–Shannon centroid for grey-valued image histograms.
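The Jensen–Shannon centroid minimizes the average Jensen–Shannon divergence to a set of distributions. The paper reports a dedicated iterative algorithm for mixture families; the sketch below is only a generic numerical stand-in (gradient descent with finite differences over a softmax parameterization), useful to see what the centroid objective looks like for categorical distributions:

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence KL(p : q); skips bins where p_i = 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen–Shannon divergence via the mid-mixture
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def softmax(t):
    # map unconstrained parameters onto the probability simplex
    e = [math.exp(x - max(t)) for x in t]
    s = sum(e)
    return [x / s for x in e]

def js_centroid(ps, iters=300, lr=0.2, eps=1e-6):
    # Approximate argmin_c mean_i JS(c, p_i) by finite-difference gradient
    # descent on softmax parameters. A generic sketch, not the paper's algorithm.
    d = len(ps[0])
    theta = [0.0] * d  # start at the uniform distribution

    def obj(t):
        c = softmax(t)
        return sum(js(c, p) for p in ps) / len(ps)

    for _ in range(iters):
        base = obj(theta)
        grad = []
        for j in range(d):
            tp = theta[:]
            tp[j] += eps
            grad.append((obj(tp) - base) / eps)
        theta = [t - lr * g for t, g in zip(theta, grad)]
    return softmax(theta)
```

For two histograms, the returned centroid lies "between" them: its average Jensen–Shannon divergence to the inputs is no larger than that of either input itself.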

Vector-Skew Jensen–Bregman Divergences and Jensen Diversities
Vector-Skew Jensen–Shannon Divergences
Building Symmetric Families of Vector-Skewed Jensen–Shannon Divergences
Mixture Families and Jensen–Shannon Divergences
Jensen–Shannon Centroids
Jensen–Shannon Centroids of Categorical Distributions
Special Cases
Conclusions and Discussion
