Abstract

The linear minimum mean-square error (LMMSE) estimator can be viewed as the solution to a certain regularized least-squares problem formulated using model covariance matrices. However, in many applications the appropriate parameters of these covariance matrices are unknown. This raises the question: how should we choose them using only the data? Using data-adaptive matrices obtained via the covariance-fitting SPICE methodology, we show that the empirical LMMSE estimator is equivalent to tuned versions of several known regularized estimators – such as ridge regression, the LASSO, and regularized least absolute deviation – depending on the chosen covariance structures. These theoretical results unify several important estimators under a common umbrella. Furthermore, through a number of numerical examples we show that the regularization parameters obtained via covariance fitting are close to optimal for a range of signal conditions.
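The ridge-regression case of the equivalence claimed above can be illustrated directly: for a linear model y = Ax + e with white priors Cov(x) = σ²ₓI and Cov(e) = σ²ₑI, the LMMSE estimate coincides with ridge regression using λ = σ²ₑ/σ²ₓ. The sketch below is not the paper's SPICE-based procedure (which fits the covariances to the data); it only verifies the underlying LMMSE/ridge identity numerically, with all variances and dimensions chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5                            # illustrative dimensions (assumed)
sigma_x2, sigma_e2 = 2.0, 0.5           # assumed prior and noise variances

A = rng.standard_normal((n, p))
x = np.sqrt(sigma_x2) * rng.standard_normal(p)
y = A @ x + np.sqrt(sigma_e2) * rng.standard_normal(n)

# LMMSE estimator: x_hat = Cx A^T (A Cx A^T + Ce)^{-1} y
Cx = sigma_x2 * np.eye(p)
Ce = sigma_e2 * np.eye(n)
x_lmmse = Cx @ A.T @ np.linalg.solve(A @ Cx @ A.T + Ce, y)

# Ridge regression with lambda = sigma_e2 / sigma_x2:
# x_hat = (A^T A + lambda I)^{-1} A^T y
lam = sigma_e2 / sigma_x2
x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)

print(np.allclose(x_lmmse, x_ridge))    # the two estimates coincide
```

The agreement follows from the matrix inversion identity Cx Aᵀ(A Cx Aᵀ + Ce)⁻¹ = (AᵀCe⁻¹A + Cx⁻¹)⁻¹AᵀCe⁻¹; the paper's contribution is choosing Cx and Ce from the data rather than assuming them known.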
