Abstract

Let $p_1, \cdots, p_n$ be $n$ probabilities that sum to $1$. The classical entropy inequality asserts that $\sum np_i \log np_i$ is nonnegative. We show that $np_i$ can be replaced here by $(np_i)^\theta$, where $\theta = 1 + (n - 1)^{-1} - (\log n)^{-1}$. This is a stronger result, and nearly best possible. For $n = 2$ the best possible result follows from the nonnegativity of the coefficients of a certain class of power series.
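The strengthened inequality can be checked numerically. The sketch below assumes the natural reading of the abstract: replacing both occurrences of $np_i$ gives $\sum (np_i)^\theta \log (np_i)^\theta \ge 0$, which, since $\theta < 1$ (as $\log n < n - 1$), inflates the negative terms where $np_i < 1$ and so strengthens the classical $\theta = 1$ case. The sampling scheme and tolerance are illustrative choices, not part of the paper.

```python
import math
import random

def theta(n):
    # Exponent from the abstract: theta = 1 + 1/(n-1) - 1/log(n); theta < 1 for n >= 2.
    return 1 + 1 / (n - 1) - 1 / math.log(n)

def strengthened_sum(p, t):
    # Sum over i of (n*p_i)^t * log((n*p_i)^t); terms with p_i = 0 contribute 0.
    n = len(p)
    total = 0.0
    for pi in p:
        if pi > 0:
            x = (n * pi) ** t
            total += x * math.log(x)
    return total

# Spot-check the claimed nonnegativity on random probability vectors.
random.seed(0)
for n in (2, 3, 10):
    t = theta(n)
    for _ in range(1000):
        w = [random.random() for _ in range(n)]
        s = sum(w)
        p = [x / s for x in w]
        assert strengthened_sum(p, t) >= -1e-9
```

At the uniform distribution $p_i = 1/n$ every term is $1 \cdot \log 1 = 0$, so the sum vanishes exactly; this is the equality case of the classical inequality as well.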
