Abstract
Upper and lower bounds on the minimum mean square error for additive noise channels are derived when the input distribution is constrained to be close to a Gaussian reference distribution in terms of the Kullback–Leibler divergence. The upper bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is defined implicitly via a system of nonlinear equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound provides an alternative to well-known inequalities in estimation and information theory—such as the Cramér–Rao lower bound, Stam's inequality, or the entropy power inequality—that is potentially tighter and defined for a larger class of input distributions. Several examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
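A minimal sketch of the problem setup described above, stated under assumed standard notation (the symbols $X$, $N$, $Y$, $P_{X_G}$, and $\epsilon$ are introduced here for illustration only and are not defined in the abstract itself):
\[
  Y = X + N, \qquad N \ \text{independent of} \ X,
\]
\[
  \sup_{P_X \,:\, D\left(P_X \,\Vert\, P_{X_G}\right) \le \epsilon}\ \mathbb{E}\!\left[\bigl\lVert X - \mathbb{E}[X \mid Y] \bigr\rVert^2\right],
\]
i.e., the worst-case MMSE over all input distributions within KL divergence $\epsilon$ of the Gaussian reference $P_{X_G}$; the bounds summarized above sandwich this quantity.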