Abstract

Shannon's Entropy Power Inequality (EPI) can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The EPI is a powerful tool and has been used to resolve a number of problems in information theory. In this paper we examine the existence of a similar entropy inequality for discrete random variables. We obtain an entropy power inequality for random variables taking values in any group of order 2^n, i.e. for such a group G we explicitly characterize the function f_G(x, y) giving the minimum entropy of the group product of two independent G-valued random variables with respective entropies x and y. Random variables achieving the extremum in this inequality are thus the analogs of Gaussians, and these are also determined. It turns out that f_G(x, y) is convex in x for fixed y and, by symmetry, convex in y for fixed x. This is a generalization to groups of order 2^n of the result known as Mrs. Gerber's Lemma.
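As a concrete illustration of the simplest case (a sketch, not code from the paper), consider G = Z_2, where the group product of two independent G-valued random variables is the XOR of two Bernoulli variables and the classical Mrs. Gerber's Lemma gives f(x, y) = h(h^{-1}(x) * h^{-1}(y)), with h the binary entropy function and * the binary convolution p * q = p(1 - q) + q(1 - p). The function names below (h, h_inv, f_Z2) are illustrative choices, not notation from the paper.

```python
import numpy as np
from scipy.optimize import brentq

def h(p):
    """Binary entropy in bits; h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def h_inv(x):
    """Inverse of binary entropy, restricted to [0, 1/2]."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 0.5
    return brentq(lambda p: h(p) - x, 0.0, 0.5)

def f_Z2(x, y):
    """Minimum entropy of X xor Y over independent binary X, Y
    with H(X) = x and H(Y) = y (classical Mrs. Gerber's Lemma)."""
    p, q = h_inv(x), h_inv(y)
    # Binary convolution: P(X xor Y = 1).
    r = p * (1 - q) + q * (1 - p)
    return h(r)

# f_Z2(x, y) >= max(x, y), and it is convex in each argument
# with the other held fixed, as the abstract states for general G.
print(f_Z2(0.5, 0.5))  # ~0.714
```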
