Abstract
When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing function for Gaussian channels. Among other consequences, this ‘strong’ EPI generalizes the vector extension of Costa’s EPI to non-Gaussian channels in a precise sense. This leads to a new reverse EPI and, as a corollary, sharpens Stam’s uncertainty principle relating entropy power and Fisher information (or, equivalently, Gross’ logarithmic Sobolev inequality). Applications to network information theory are also given, including a short self-contained proof of the rate region for the two-encoder quadratic Gaussian source coding problem and a new outer bound for the one-sided Gaussian interference channel.
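For reference, the classical inequalities being sharpened admit the following standard statements (the notation here is a common textbook formulation, not necessarily the paper's own):

```latex
% Standard statements of the EPI and Stam's inequality, given for
% reference; notation is a common convention, not taken from the paper.
% Entropy power of an n-dimensional random vector X with density,
% where h(X) denotes differential entropy:
\[
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}.
\]
% Entropy power inequality (Shannon--Stam), for independent X and Y,
% with equality iff X, Y are Gaussian with proportional covariances:
\[
  N(X + Y) \;\ge\; N(X) + N(Y).
\]
% Stam's uncertainty principle, with J(X) the Fisher information,
% with equality iff X is Gaussian:
\[
  N(X)\, J(X) \;\ge\; n.
\]
```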