Abstract
The quadratic decay property of the information rate function states that, given a fixed conditional distribution $p_{\mathsf{Y}|\mathsf{X}}$, the mutual information between the (finite) discrete random variables $\mathsf{X}$ and $\mathsf{Y}$ decreases at least quadratically in the Euclidean distance as $p_\mathsf{X}$ moves away from the set of capacity-achieving input distributions. This property of the information rate function is particularly useful in the study of higher-order asymptotics and finite-blocklength information theory, where it was already implicitly used by Strassen [1] and later, more explicitly, by Polyanskiy-Poor-Verd\'u [2]. However, the proofs outlined in both works contain gaps that are nontrivial to close. This comment provides an alternative, complete proof of this property.
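As a numerical illustration (not part of the paper), the quadratic decay is easy to observe for a binary symmetric channel, where the uniform input is the unique capacity-achieving distribution. The sketch below, under the assumption of a BSC with crossover probability $\varepsilon = 0.1$, computes the gap between capacity and the mutual information at input distributions a distance $\delta$ from uniform; the gap shrinks by roughly a factor of 4 each time $\delta$ is halved, consistent with quadratic decay.

```python
import math

def binary_entropy(q):
    # H(q) in nats; 0 at the endpoints by continuity
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log(q) - (1 - q) * math.log(1 - q)

def mutual_info_bsc(p, eps):
    # I(X;Y) = H(Y) - H(Y|X) for a BSC(eps) with input P(X=1) = p.
    # Output distribution: P(Y=1) = p(1-eps) + (1-p)eps; H(Y|X) = H(eps).
    q = p * (1 - eps) + (1 - p) * eps
    return binary_entropy(q) - binary_entropy(eps)

eps = 0.1
capacity = mutual_info_bsc(0.5, eps)  # capacity achieved at the uniform input

# Gap between capacity and I(X;Y) as the input moves delta away from uniform.
for delta in (0.1, 0.05, 0.025):
    gap = capacity - mutual_info_bsc(0.5 + delta, eps)
    print(f"delta={delta:.3f}  gap={gap:.6f}  gap/delta^2={gap / delta**2:.4f}")
```

The ratio `gap/delta^2` stays close to a constant (half the magnitude of the second derivative of the rate function at the uniform input), which is exactly the quadratic behavior the property asserts in general.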