Abstract

Divergences, also known as contrast functions, are distance-like quantities defined on manifolds of non-negative or probability measures. Using the duality in optimal transport, we introduce and study the one-parameter family of $L^{(\pm\alpha)}$-divergences. They extrapolate between the Bregman divergence corresponding to the Euclidean quadratic cost and the L-divergence introduced by Pal and the author in connection with portfolio theory and a logarithmic cost function. They admit natural generalizations of the exponential family that are closely related to the $\alpha$-family and the $q$-exponential family. In particular, the $L^{(\pm\alpha)}$-divergences of the corresponding potential functions are Rényi divergences. Using this unified framework we prove that the induced geometries are dually projectively flat with constant sectional curvatures, and that a generalized Pythagorean theorem holds. Conversely, we show that if a statistical manifold is dually projectively flat with constant curvature $\pm\alpha$ with $\alpha > 0$, then it is locally induced by an $L^{(\mp\alpha)}$-divergence. We define in this context a canonical divergence which extends the one for dually flat manifolds.
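
The following is a minimal numerical sketch of the interpolation claimed above, assuming the form of the $L^{(\alpha)}$-divergence used in Pal and Wong's earlier work, $L^{(\alpha)}_\varphi(x \,\|\, y) = \tfrac{1}{\alpha}\log\bigl(1 + \alpha\, \nabla\varphi(y)\cdot(x-y)\bigr) - \bigl(\varphi(x) - \varphi(y)\bigr)$ for a suitably ($\alpha$-exponentially) concave potential $\varphi$. This definition is not stated in the abstract itself, and the function names below are illustrative rather than taken from the paper. The sketch checks numerically that, as $\alpha \to 0^+$, the $L^{(\alpha)}$-divergence approaches the Bregman divergence of $-\varphi$, i.e. the Euclidean quadratic case.

```python
import numpy as np

# Assumed form of the L^(alpha)-divergence (not defined in the abstract):
#   L^(alpha)(x || y) = (1/alpha) * log(1 + alpha * <grad phi(y), x - y>)
#                       - (phi(x) - phi(y)),
# for a suitably (alpha-exponentially) concave potential phi.

def l_alpha_divergence(phi, grad_phi, x, y, alpha):
    inner = np.dot(grad_phi(y), x - y)
    return np.log1p(alpha * inner) / alpha - (phi(x) - phi(y))

def bregman_divergence(phi, grad_phi, x, y):
    # Bregman divergence of -phi; the formal alpha -> 0 limit of L^(alpha),
    # since (1/alpha) * log(1 + alpha * t) -> t as alpha -> 0.
    return np.dot(grad_phi(y), x - y) - (phi(x) - phi(y))

# Illustration: phi(x) = -||x||^2 / 2 near the origin, for which the
# alpha -> 0 limit is the quadratic divergence ||x - y||^2 / 2
# associated with the Euclidean cost.
phi = lambda x: -0.5 * np.dot(x, x)
grad_phi = lambda x: -x

x = np.array([0.3, 0.4])
y = np.array([0.1, 0.2])

for alpha in (1.0, 0.1, 0.01, 1e-4):
    val = l_alpha_divergence(phi, grad_phi, x, y, alpha)
    print(f"alpha={alpha:g}: L^(alpha) = {val:.6f}")
print(f"Bregman limit: {bregman_divergence(phi, grad_phi, x, y):.6f}")
```

For this choice of $\varphi$ the printed values converge to $\|x-y\|^2/2 = 0.04$ as $\alpha$ decreases, illustrating how the family deforms the Bregman (quadratic-cost) case at $\alpha > 0$ and recovers it in the limit.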
