Abstract

We study the geometry of probability distributions with respect to a generalized family of Csiszár f-divergences. A member of this family is the relative $\alpha$-entropy, which is a Rényi analog of relative entropy in information theory and is known as the logarithmic or projective power divergence in statistics. We apply Eguchi's theory to derive the Fisher information metric and the dual affine connections arising from these generalized divergence functions. This enables us to arrive at a more widely applicable version of the Cramér–Rao inequality, which provides a lower bound on the variance of an estimator of an escort of the underlying parametric probability distribution. We then extend Amari and Nagaoka's dually flat structure of the exponential and mixture models to other distributions with respect to the aforementioned generalized metric. We show that these formulations lead to unbiased and efficient estimators for the escort model. Finally, we compare our work with prior results on generalized Cramér–Rao inequalities that were derived from non-information-geometric frameworks.
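As a rough sketch of the objects the abstract refers to, the block below records the standard textbook forms of Eguchi's induced metric, the escort distribution, and the schematic shape of a Cramér–Rao-type bound for the escort model; these are not quoted from the paper, and the authors' exact conventions (signs, normalizations, choice of divergence) may differ.

```latex
% Hedged sketch; standard forms, not the paper's exact conventions.

% Eguchi's recipe: a smooth divergence D on a parametric family {p_theta}
% induces a Riemannian metric by differentiating at the diagonal.
g^{(D)}_{ij}(\theta)
  = -\left.\frac{\partial}{\partial\theta^{i}}
           \frac{\partial}{\partial{\theta'}^{j}}
      D\!\left(p_{\theta},\, p_{\theta'}\right)\right|_{\theta'=\theta}.

% Escort (alpha-scaled) distribution of p_theta, the model whose
% estimators are bounded in the generalized Cramér–Rao inequality.
\hat{p}_{\theta,\alpha}(x)
  = \frac{p_{\theta}(x)^{\alpha}}{\int p_{\theta}(y)^{\alpha}\,\mathrm{d}y}.

% Schematic Cramér–Rao-type bound: for an unbiased estimator \hat\theta
% under the escort model, with G_alpha the metric induced by the
% generalized divergence,
\operatorname{Var}_{\hat{p}_{\theta,\alpha}}\!\big(\hat{\theta}\big)
  \;\succeq\; G_{\alpha}(\theta)^{-1},
% which recovers the classical bound Var(\hat\theta) >= I(\theta)^{-1}
% in the limit alpha -> 1.
```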

