Abstract

The Fisher information J(X) of a random variable X under a translation parameter appears in information theory in the classical proof of the entropy-power inequality (EPI). It enters the proof of the EPI via the de Bruijn identity, where it measures the variation of the differential entropy under a Gaussian perturbation, and via the convolution inequality J(X+Y)^{-1} ≥ J(X)^{-1} + J(Y)^{-1} (for independent X and Y), known as the Fisher information inequality (FII). The FII is proved in the literature directly, in a rather involved way. We give an alternative derivation of the FII as a simple consequence of a "data processing inequality" for the Cramér-Rao lower bound on parameter estimation.
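
For reference, the two ingredients named above can be sketched in standard LaTeX notation (here Z is a standard Gaussian independent of X, h denotes differential entropy, and t > 0 is the perturbation parameter; these symbols are not defined in the abstract itself):

    \frac{d}{dt}\, h\bigl(X + \sqrt{t}\, Z\bigr) = \tfrac{1}{2}\, J\bigl(X + \sqrt{t}\, Z\bigr)          % de Bruijn identity
    \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}                                             % FII, X and Y independent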
