Abstract

This work presents strong data processing results for the power-constrained additive Gaussian channel. Explicit bounds are given on the decrease in mutual information under convolution with Gaussian noise. The analysis leverages the connection between information and estimation (I-MMSE) together with the following estimation-theoretic result of independent interest: any random variable for which there exists an almost optimal (in terms of mean-squared error) linear estimator operating on the Gaussian-corrupted measurement must necessarily be almost Gaussian (in terms of the Kolmogorov-Smirnov distance).
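For background, the information-estimation connection referred to here is the standard I-MMSE identity of Guo, Shamai, and Verdú, quoted below as context rather than as part of the abstract itself. For a square-integrable input X and standard Gaussian noise N independent of X,

% I-MMSE identity (background, not stated in the abstract):
% the derivative of mutual information in snr equals half the MMSE.
\[
  \frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\,
    I\bigl(X;\sqrt{\mathsf{snr}}\,X+N\bigr)
  \;=\; \frac{1}{2}\,\mathrm{mmse}(X,\mathsf{snr}),
  \qquad
  \mathrm{mmse}(X,\mathsf{snr})
  \;=\; \mathbb{E}\!\left[\bigl(X-\mathbb{E}[X\mid \sqrt{\mathsf{snr}}\,X+N]\bigr)^{2}\right].
\]

The estimation-theoretic result mentioned above compares this minimum mean-squared error with the error of the best linear estimator; near-equality of the two forces the input distribution to be close to Gaussian.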
