Abstract
Entropy is well known to be Schur concave on finite alphabets. Recently, the authors strengthened this result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q exceeds the entropy of P by at least the relative entropy D(P||Q). This result also holds when P and Q are defined on countably infinite alphabets. This paper shows the counterpart of this result for the Rényi entropy and the Tsallis entropy. Lower bounds on the difference in the Rényi (or Tsallis) entropy are given in terms of a new divergence that is related to the Rényi (or Tsallis) divergence. This paper also considers a notion of generalized mutual information, namely α-mutual information, which is defined through the Rényi divergence. Its convexity/concavity for different ranges of α is established. A sufficient condition for its Schur concavity is discussed, and upper bounds on α-mutual information are given in terms of the Rényi entropy.
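As a concrete anchor for the results stated above, the strengthened Schur-concavity inequality and one standard definition of α-mutual information can be written as follows. This is a hedged sketch: the notation (≺ for "is majorized by," H for Shannon entropy, D for relative entropy, D_α for Rényi divergence) follows common usage, and the Sibson-style minimization below is one standard way to define α-mutual information through the Rényi divergence; the paper's exact definition may differ.

```latex
% Hedged sketch, not quoted from the paper.
% Q \prec P reads "Q is majorized by P".
\[
  Q \prec P \;\Longrightarrow\; H(Q) \,\ge\, H(P) + D(P \,\|\, Q).
\]
% One standard (Sibson-style) definition of \alpha-mutual information
% via the Renyi divergence D_\alpha; the minimum is over distributions Q_Y:
\[
  I_\alpha(X;Y) \;=\; \min_{Q_Y} D_\alpha\!\left(P_{XY} \,\middle\|\, P_X \times Q_Y\right).
\]
```

As α → 1, D_α reduces to the relative entropy D and I_α recovers the usual mutual information, which is the sense in which α-mutual information generalizes the classical quantity.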