Abstract

If N is standard Gaussian, the minimum mean square error (MMSE) of estimating a random variable X based on √(snr) X + N vanishes at least as fast as 1/snr as snr → ∞. We define the MMSE dimension of X as the limit as snr → ∞ of the product of snr and the MMSE. The MMSE dimension is also shown to be the asymptotic ratio of nonlinear MMSE to linear MMSE. For discrete, absolutely continuous, or mixed distributions we show that the MMSE dimension equals Rényi's information dimension. However, for a class of self-similar singular X (e.g., the Cantor distribution), we show that the product of snr and MMSE oscillates around the information dimension periodically in snr (dB). We also show that these results extend considerably beyond Gaussian noise under various technical conditions.
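A minimal numerical sketch (not from the paper) of the abstract's high-snr claim: for a discrete X the product snr · MMSE tends to 0, while for an absolutely continuous X it tends to 1, matching the information dimension in each case. The helper mmse_discrete is a hypothetical name; it uses the exact posterior mean for a finite-alphabet X with Monte Carlo averaging, and the Gaussian case uses the closed form mmse = 1/(1 + snr) for X ~ N(0,1).

```python
import numpy as np

rng = np.random.default_rng(0)

def mmse_discrete(atoms, probs, snr, n=200_000):
    """Monte Carlo estimate of E[(X - E[X|Y])^2] for Y = sqrt(snr)*X + N, N ~ N(0,1)."""
    atoms = np.asarray(atoms, float)
    probs = np.asarray(probs, float)
    x = rng.choice(atoms, size=n, p=probs)
    y = np.sqrt(snr) * x + rng.standard_normal(n)
    # Posterior weight of atom x_i given y is proportional to p_i * phi(y - sqrt(snr)*x_i).
    z = y[:, None] - np.sqrt(snr) * atoms[None, :]
    logw = np.log(probs)[None, :] - 0.5 * z**2
    logw -= logw.max(axis=1, keepdims=True)   # subtract max for numerical stability
    w = np.exp(logw)
    xhat = (w * atoms[None, :]).sum(axis=1) / w.sum(axis=1)  # conditional mean E[X|Y]
    return np.mean((x - xhat) ** 2)

for snr_db in (10, 20, 30):
    snr = 10 ** (snr_db / 10)
    d_discrete = snr * mmse_discrete([0.0, 1.0], [0.5, 0.5], snr)
    d_gaussian = snr * (1.0 / (1.0 + snr))    # closed-form MMSE for X ~ N(0,1)
    print(f"snr = {snr_db:2d} dB:  snr*MMSE  discrete ~ {d_discrete:.4f}, "
          f"Gaussian ~ {d_gaussian:.4f}")

# Discrete X (information dimension 0): snr*MMSE -> 0.
# Absolutely continuous X (information dimension 1): snr*MMSE -> 1.
```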
