Abstract

Entropy and relative entropy measures play a crucial role in mathematical information theory. Relative entropies are also widely used in statistics under the name of divergence measures, which link these two fields through the minimum divergence principle. Divergence measures are popular among statisticians because many of the corresponding minimum divergence methods lead to robust inference in the presence of outliers in the observed data; examples include the ϕ-divergence, the density power divergence, the logarithmic density power divergence and the recently developed family of logarithmic super divergence (LSD) measures. In this paper, we present an alternative information theoretic formulation of the LSD measures as a two-parameter generalization of the relative α-entropy, which we refer to as the general (α, β)-entropy. We explore its relation with various other entropies and divergences, which also generates a two-parameter extension of the Rényi entropy measure as a by-product. The paper is primarily focused on the geometric properties of the relative (α, β)-entropy, or equivalently the LSD measures; we prove their continuity and convexity in both arguments, along with an extended Pythagorean relation under a power transformation of the domain space. We also derive a set of sufficient conditions under which the forward and reverse projections of the relative (α, β)-entropy exist and are unique. Finally, we briefly discuss potential applications of the relative (α, β)-entropy, or the LSD measures, in statistical inference, in particular for robust parameter estimation and hypothesis testing. Our results on the reverse projection of the relative (α, β)-entropy establish, for the first time, the existence and uniqueness of the minimum LSD estimators. Numerical illustrations are provided for the problem of estimating the binomial parameter.
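
For reference, the two classical quantities that the general (α, β)-entropy extends are the relative entropy (Kullback-Leibler divergence) and the Rényi entropy. A sketch of the standard definitions, in generic notation rather than the paper's own:

    D(P \| Q) = \int p \, \log \frac{p}{q} \, d\mu,
    \qquad
    H_\alpha(P) = \frac{1}{1 - \alpha} \log \int p^\alpha \, d\mu,
    \quad \alpha > 0, \ \alpha \neq 1,

with H_α recovering the Shannon entropy as α → 1. The relative α-entropy likewise reduces to D(P‖Q) as α → 1; the general (α, β)-entropy studied here adds a second tuning parameter β on top of this one-parameter family.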

Highlights

  • Decision making under uncertainty is the backbone of modern information science

  • We study the geometric properties of all members of the relative (α, β)-entropy family, or equivalently the LSD measures, including their continuity in both the arguments and a Pythagorean-type relation

Summary

Introduction

Decision making under uncertainty is the backbone of modern information science. The works of C. E. Shannon laid the foundation of this field. One needs to be careful to distinguish applications of newly introduced entropies and divergence measures for the purpose of inference under given information from those where they are used as measures of complexity. In this respect, we would like to emphasize that the main advantage of our two-parameter extended family of LSD or relative (α, β)-entropy measures in parametric statistical inference is their strong robustness against possible contamination (generally manifested through outliers) in the sample data. Another important issue is whether to stop at the two-parameter level for information measures or to extend to three, four, or more parameters. Since it is a known principle that one "should not multiply entities beyond necessity", we will, for the sake of parsimony, restrict ourselves to the second level of generalization for robust statistical inference, at least until there is convincing evidence that a higher level of generalization can produce a significant improvement.
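
To make the robustness point concrete, the sketch below contrasts maximum likelihood with a minimum divergence estimate on contaminated data. Since the LSD objective itself is not reproduced on this page, the sketch uses the simpler density power divergence, one of the robust families listed in the abstract; the function names, the normal location model, and the choice alpha = 0.5 are illustrative assumptions, not the paper's implementation.

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    def dpd_objective(mu, x, alpha):
        """Minimum density power divergence objective for a N(mu, 1) model.

        Up to a term free of mu, the DPD between data and model is
            integral f^(1+alpha) - (1 + 1/alpha) * mean(f(X_i)^alpha).
        For N(mu, 1) the integral equals (2*pi)^(-alpha/2) / sqrt(1 + alpha),
        a constant in mu, kept here for completeness.
        """
        integral_term = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
        dens = norm.pdf(x, loc=mu, scale=1.0)
        return integral_term - (1 + 1 / alpha) * np.mean(dens ** alpha)

    rng = np.random.default_rng(0)
    # 90% of the sample from the true N(5, 1), 10% gross outliers near 20.
    x = np.concatenate([rng.normal(5.0, 1.0, 90), rng.normal(20.0, 0.5, 10)])

    mle = x.mean()  # maximum likelihood estimate, dragged up by the outliers
    dpd = minimize_scalar(dpd_objective, bounds=(0.0, 30.0),
                          args=(x, 0.5), method="bounded").x

    print(f"MLE of mu:             {mle:.3f}")  # well above 5, distorted
    print(f"Minimum DPD (a = 0.5): {dpd:.3f}")  # close to 5, downweights outliers

The downweighting arises because minimizing the objective amounts to maximizing the mean of f(X_i)^alpha, so observations with negligible model density contribute almost nothing; the minimum LSD estimators discussed in this paper pursue the same robustness goal through a two-parameter family.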

Definition
Relations with Different Existing or New Entropies and Divergences
Continuity
Convexity
Extended Pythagorean Relation
The Reverse Projection and Parametric Estimation
Numerical Illustration
Application to Testing Statistical Hypotheses
Conclusions