Abstract

We investigate optimal bias corrections in the problem of linear minimum mean square error (LMMSE) estimation of a scalar parameter that is linearly related to a set of Gaussian multidimensional observations. We address the problem of finding the optimal scaling for a class of LMMSE filter implementations based on the sample covariance matrix (SCM). Applying recent results from random matrix theory, the scaling factor that minimizes the mean square error (MSE), which depends on both the unknown covariance matrix and its sample estimate, is first analyzed asymptotically in terms of the key scenario parameters and then estimated using the SCM. As the main result, we obtain a universal scaling factor that minimizes the estimator MSE and dramatically outperforms the conventional LMMSE filter implementation. This paper considers a Bayesian setting in which the unknown parameter is random with known mean and variance, but exactly the same methodology applies to the classical estimation setup with deterministic parameters.
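To make the setup concrete, the following is a minimal numerical sketch, not the paper's algorithm: it builds the conventional plug-in LMMSE filter from the SCM for a linear Gaussian model and compares it against a scaled version using the oracle MSE-minimizing scalar, which depends on both the true covariance and the SCM, as described above. The paper's universal, SCM-only estimator of this scaling is not reproduced here; the dimensions, signature vector `h`, and covariances are illustrative assumptions.

```python
# Illustrative sketch (not the paper's estimator): conventional SCM-based
# LMMSE filter vs. the same filter with the oracle MSE-minimizing scaling.
import numpy as np

rng = np.random.default_rng(0)

M, N = 20, 40            # observation dimension, number of samples (assumed)
sigma_x2 = 1.0           # known variance of the zero-mean scalar parameter x
h = rng.standard_normal(M)                      # known signature vector
A = rng.standard_normal((M, M)) / np.sqrt(M)
R_n = A @ A.T + 0.1 * np.eye(M)                 # true (unknown) noise covariance
R_y = sigma_x2 * np.outer(h, h) + R_n           # true observation covariance

def simulate(n_samples):
    """Draw x_k ~ N(0, sigma_x2) and y_k = h*x_k + n_k with n_k ~ N(0, R_n)."""
    x = np.sqrt(sigma_x2) * rng.standard_normal(n_samples)
    n = rng.multivariate_normal(np.zeros(M), R_n, size=n_samples)
    return x, x[:, None] * h + n

# Build the SCM from N snapshots and the conventional plug-in LMMSE filter
# w = sigma_x2 * SCM^{-1} h (the sample analogue of R_y^{-1} E[y x]).
x_tr, Y_tr = simulate(N)
R_scm = Y_tr.T @ Y_tr / N
w = sigma_x2 * np.linalg.solve(R_scm, h)

# Oracle scaling alpha* = argmin_a E|a * w^T y - x|^2
#                       = sigma_x2 * (w^T h) / (w^T R_y w),
# which depends on both the true covariance R_y and the SCM (through w).
alpha = sigma_x2 * (w @ h) / (w @ R_y @ w)

# Monte Carlo comparison of the two implementations.
x_te, Y_te = simulate(100_000)
mse_plain  = np.mean((Y_te @ w - x_te) ** 2)
mse_scaled = np.mean((alpha * (Y_te @ w) - x_te) ** 2)
print(f"alpha* = {alpha:.3f},  MSE conventional = {mse_plain:.4f},  "
      f"MSE scaled = {mse_scaled:.4f}")
```

In the sample-starved regime (N comparable to M), the plug-in filter is badly biased and the scaled version yields a visibly lower MSE, which is the effect the asymptotic analysis quantifies.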
