Abstract

We show that the uniform distribution minimizes entropy among all one-dimensional symmetric log-concave distributions with fixed variance, and we establish various generalizations of this fact to Rényi entropies of orders less than 1 and to moment constraints involving $p$-th absolute moments with $p\leq 2$. As consequences, we give new capacity bounds for additive noise channels with symmetric log-concave noise, as well as for timing channels with positive signal and noise in which the noise has a decreasing log-concave density. In particular, we show that the capacity of an additive noise channel with symmetric, log-concave noise under an average power constraint exceeds the capacity of an additive Gaussian noise channel with the same noise power by at most 0.254 bits per channel use. Consequences for reverse entropy power inequalities and connections to the slicing problem in convex geometry are also discussed.
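For intuition on where a constant of this size arises, here is a short standard computation (an illustrative sketch, not reproduced from the paper): at fixed variance $\sigma^2$, the Gaussian maximizes differential entropy, while by the result above the uniform minimizes it among symmetric log-concave laws. A uniform distribution on $[-a,a]$ has variance $a^2/3$, hence $a=\sqrt{3}\,\sigma$ and
\[
h(U) = \log_2(2a) = \tfrac{1}{2}\log_2\!\left(12\sigma^2\right), \qquad h(G) = \tfrac{1}{2}\log_2\!\left(2\pi e\,\sigma^2\right),
\]
so the entropy gap between the two extremes is
\[
h(G) - h(U) = \tfrac{1}{2}\log_2\frac{2\pi e}{12} = \tfrac{1}{2}\log_2\frac{\pi e}{6} \approx 0.2546 \text{ bits},
\]
which is consistent with the 0.254-bit constant in the capacity bound stated above.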
