Abstract

This paper concerns large covariance matrix estimation via composite minimization under the assumption of a low rank plus sparse structure. In this approach, the low rank plus sparse decomposition of the covariance matrix is recovered by least squares minimization under nuclear norm plus ℓ1 norm penalization, and the objective is minimized via a singular value thresholding plus soft thresholding algorithm. This paper proposes a new estimator based on an additional least-squares re-optimization step aimed at un-shrinking the eigenvalues of the low rank component estimated in the first step. We prove that this un-shrinkage brings the final estimate as close as possible to the target in spectral and Frobenius norm, while exactly recovering the underlying rank and sparsity pattern. The error bounds are derived under the assumption that the latent eigenvalues scale as p^α and the maximum number of non-zeros per row in the sparse component scales as p^δ, where p is the dimension, α ∈ [0,1], δ ∈ [0,0.5], and δ < α. The sample size n is required to scale at least as p^(1.5δ). The resulting estimator is called UNALCE (UNshrunk ALgebraic Covariance Estimator), and it is shown to outperform state-of-the-art estimators, especially in terms of fit and sparsity pattern detection. The effectiveness of UNALCE is illustrated with a real example involving ECB (European Central Bank) banking supervisory data.
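To make the first-step procedure described above concrete, the sketch below shows a generic alternating singular value thresholding / soft thresholding scheme in Python/NumPy, followed by a simplified eigenvalue un-shrinkage step. The function names, the penalty parameters psi and rho, the fixed iteration count, and the exact form of the un-shrinkage are illustrative assumptions for exposition only, not the authors' implementation.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def soft(M, tau):
    """Entry-wise soft thresholding: prox of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def low_rank_plus_sparse(Sigma_hat, psi, rho, n_iter=200):
    """Alternating proximal updates for
       0.5*||L + S - Sigma_hat||_F^2 + psi*||L||_* + rho*||S||_1."""
    L = np.zeros_like(Sigma_hat)
    S = np.zeros_like(Sigma_hat)
    for _ in range(n_iter):
        L = svt(Sigma_hat - S, psi)   # update the low rank component
        S = soft(Sigma_hat - L, rho)  # update the sparse component
    return L, S

def unshrink(L, S, Sigma_hat):
    """Illustrative un-shrinkage step (an assumption, not the paper's
    exact procedure): keep the recovered eigenvectors and rank of L
    fixed and refit the retained eigenvalues by least squares against
    Sigma_hat - S."""
    vals, vecs = np.linalg.eigh(L)
    keep = vals > 1e-10                      # retained rank from step 1
    V = vecs[:, keep]
    d = np.diag(V.T @ (Sigma_hat - S) @ V)   # least-squares refit of eigenvalues
    return (V * d) @ V.T
```

The alternating updates correspond to the proximal operators of the nuclear norm (singular value thresholding) and of the ℓ1 norm (soft thresholding); the un-shrinkage function is only meant to convey the idea of re-optimizing the eigenvalues of the low rank component while holding the recovered rank and sparsity pattern fixed.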
