Abstract

In continuation of an earlier work, we further consider the problem of robust estimation of a random vector (or signal) with an uncertain covariance matrix, observed through a known linear transformation and corrupted by additive noise with a known covariance matrix. Whereas in the earlier work we developed a competitive minimax approach that minimizes a worst-case mean-squared error (MSE) difference regret criterion, here we study, in the same spirit, the minimum worst-case MSE ratio regret criterion, namely, the worst-case ratio (rather than difference) between the MSE attainable by a linear estimator that is ignorant of the exact signal covariance and the minimum MSE (MMSE) attainable by optimal linear estimation with a known signal covariance. We present the optimal linear estimator under this criterion in two ways: first, as the solution to a certain semidefinite programming (SDP) problem, and second, as an expression that is in closed form up to a single parameter whose value can be found by a simple line search. We then show that the linear minimax ratio regret estimator can also be interpreted as the MMSE estimator for a certain choice of signal covariance that depends on the uncertainty region. We demonstrate that, in applications, the proposed minimax MSE ratio regret approach may outperform the well-known minimax MSE approach, the minimax MSE difference regret approach, and the "plug-in" approach, in which the MMSE estimator is used with an estimated covariance matrix in place of the true unknown covariance.
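To make the criterion concrete, the following is a standard formalization of the setup described above; the notation is ours and is an assumption, since the abstract does not fix symbols or the precise form of the uncertainty region. The observation model and the minimax ratio regret criterion can be written as

\[
\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{w}, \qquad
\operatorname{Cov}(\mathbf{w}) = \mathbf{C}_w \ \text{(known)}, \qquad
\mathbf{C}_x = \operatorname{Cov}(\mathbf{x}) \in \mathcal{U},
\]

where \(\mathbf{H}\) is the known linear transformation and \(\mathcal{U}\) denotes the covariance uncertainty region. For a linear estimator \(\hat{\mathbf{x}} = \mathbf{G}\mathbf{y}\), let \(\operatorname{MSE}(\mathbf{G},\mathbf{C}_x)\) denote its mean-squared error and \(\operatorname{MMSE}(\mathbf{C}_x)\) the error of the optimal linear (Wiener) estimator that knows \(\mathbf{C}_x\). The minimax ratio regret estimator is then

\[
\mathbf{G}^{\star} \;=\; \arg\min_{\mathbf{G}} \; \max_{\mathbf{C}_x \in \mathcal{U}} \;
\frac{\operatorname{MSE}(\mathbf{G},\mathbf{C}_x)}{\operatorname{MMSE}(\mathbf{C}_x)},
\]

in contrast to the difference regret criterion of the earlier work, which minimizes \(\max_{\mathbf{C}_x \in \mathcal{U}} \big[\operatorname{MSE}(\mathbf{G},\mathbf{C}_x) - \operatorname{MMSE}(\mathbf{C}_x)\big]\).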
