Abstract

We investigate the behavior of the mean-square error (MSE) of low-rank and sparse matrix decomposition, in particular the special case of robust principal component analysis (RPCA) and its generalization, matrix completion and correction (MCC). We derive a constrained Cramér-Rao bound (CRB) for any locally unbiased estimator of the low-rank matrix and of the sparse matrix. We analyze the typical behavior of the constrained CRB for MCC, where a subset of the entries of the underlying matrix is randomly observed, some of which are grossly corrupted. Using a concentration-of-measure argument, we obtain approximate constrained CRBs. We design an alternating minimization procedure to compute the maximum-likelihood estimator (MLE) of the low-rank matrix and the sparse matrix, assuming knowledge of the rank and the sparsity level. For relatively small rank and sparsity level, we demonstrate numerically that the performance of the MLE approaches the constrained CRB when the signal-to-noise ratio is high. We discuss the implications of these bounds and compare them with the empirical performance of the accelerated proximal gradient algorithm, as well as with other existing bounds in the literature.
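The alternating minimization idea mentioned above can be illustrated in the fully observed RPCA case, where each step has a closed form under i.i.d. Gaussian noise: a truncated SVD for the low-rank component and hard thresholding to the known sparsity level for the sparse component. The sketch below is illustrative only; the function name, stopping rule, and update details are assumptions, not the paper's exact procedure (which also handles the partially observed MCC setting).

```python
import numpy as np

def rpca_mle_altmin(Y, rank, sparsity, n_iters=100):
    """Illustrative alternating-minimization sketch for the RPCA MLE
    under Gaussian noise, with the rank and sparsity level assumed known.
    Hypothetical implementation, not the authors' exact algorithm."""
    S = np.zeros_like(Y)
    for _ in range(n_iters):
        # L-step: best rank-`rank` approximation of the residual Y - S
        U, sig, Vt = np.linalg.svd(Y - S, full_matrices=False)
        L = (U[:, :rank] * sig[:rank]) @ Vt[:rank, :]
        # S-step: keep only the `sparsity` largest-magnitude entries of Y - L
        R = Y - L
        S = np.zeros_like(Y)
        idx = np.unravel_index(
            np.argsort(np.abs(R), axis=None)[-sparsity:], R.shape)
        S[idx] = R[idx]
    return L, S

# Example on synthetic data: low-rank plus sparse plus noise
rng = np.random.default_rng(0)
m, n, r, k = 50, 50, 3, 20
L_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
S_true = np.zeros((m, n))
S_true[rng.integers(0, m, k), rng.integers(0, n, k)] = 10.0
Y = L_true + S_true + 0.01 * rng.standard_normal((m, n))
L_hat, S_hat = rpca_mle_altmin(Y, rank=r, sparsity=k)
```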
