Abstract

The Chebyshev center computation problem, i.e., finding the point that minimizes the maximum distance to a given set of points, is studied on the probability simplex with the $\alpha$-divergence as the distance measure. The proposed solution generalizes the Arimoto-Blahut (AB) algorithm, which utilizes the Kullback-Leibler divergence, to the $\alpha$-divergence, and reduces to the AB method as $\alpha \rightarrow 1$. Like the AB algorithm, the proposed method is an ascent method with a guaranteed improvement of the objective value (the $\alpha$-mutual information, or the Chebyshev radius) at every iteration. A practical application area for the method is the fusion of probability mass functions lacking a joint probability description; another is the calculation of error exponents.
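For context on the $\alpha \rightarrow 1$ limit mentioned above, the sketch below shows the classical Arimoto-Blahut iteration for the Kullback-Leibler case, i.e., computing the center $q^*$ minimizing $\max_i D(p_i \| q)$ over PMFs $q$. This is not the paper's $\alpha$-divergence algorithm; the function names (`kl`, `chebyshev_center_kl`), the fixed iteration count, and the example data are illustrative assumptions. Each pass reweights the given PMFs in proportion to the exponential of their divergence from the current mixture, which is the classical ascent step: its objective (the mutual information, equal at the optimum to the Chebyshev radius) never decreases across iterations.

```python
# A minimal sketch of the classical Arimoto-Blahut iteration for the
# KL-divergence (alpha -> 1) special case; NOT the paper's alpha-divergence
# method. Computes the Chebyshev center q* of PMFs p_1..p_m on the simplex,
# i.e., the minimizer of max_i D(p_i || q). The optimal value (the Chebyshev
# radius) equals the capacity of the channel whose rows are the p_i.
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, with 0 log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chebyshev_center_kl(P, n_iter=500):
    """P: (m, n) array whose rows are PMFs. Returns (center q, radius)."""
    m, _ = P.shape
    w = np.full(m, 1.0 / m)               # weights over the given points
    for _ in range(n_iter):
        q = w @ P                         # candidate center: mixture of rows
        d = np.array([kl(p, q) for p in P])
        w = w * np.exp(d)                 # exponential reweighting (ascent step)
        w /= w.sum()                      # renormalize onto the simplex
    q = w @ P
    radius = max(kl(p, q) for p in P)     # Chebyshev radius at convergence
    return q, radius

# Example (hypothetical data): center of three PMFs on a 4-letter alphabet.
P = np.array([[0.70, 0.10, 0.10, 0.10],
              [0.10, 0.70, 0.10, 0.10],
              [0.25, 0.25, 0.25, 0.25]])
q, r = chebyshev_center_kl(P)
```

At the optimum, the divergences $D(p_i \| q^*)$ of all points with nonzero weight are equal to the radius, which is the defining property of the Chebyshev center in this geometry.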
