Abstract

The inverse covariance matrix, also known as the precision matrix, has wide applications in signal processing and is often estimated from training samples. The quality of the estimate can be poor when the sample support is low. Banding and tapering are effective regularization approaches for covariance and precision matrix estimation, but the bandwidth must be properly chosen. This paper investigates the bandwidth selection problem for banding/tapering-based precision matrix estimation. Exploiting a regression interpretation of the precision matrix, we design a data-driven cross-validation (CV) method that automatically tunes the bandwidth. The effectiveness of the proposed method is demonstrated by numerical examples under a quadratic loss.
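
To make the idea concrete, the following is a minimal sketch (not the paper's exact procedure) of cross-validated bandwidth selection for a banded precision-matrix estimator. It assumes the common regression (modified-Cholesky) construction, in which each variable is regressed on at most k preceding variables, and it scores candidate bandwidths with a quadratic-type discrepancy on held-out data; the function names and the specific loss are illustrative assumptions.

```python
# Hedged sketch: CV bandwidth selection for a k-banded precision matrix,
# using the regression / modified-Cholesky construction
#   Omega_hat = T' D^{-1} T,
# where row j of T holds (negated) regression coefficients of variable j
# on at most k preceding variables and D holds the residual variances.
# Names and the quadratic-type CV loss below are illustrative, not the
# paper's exact method.
import numpy as np

def banded_precision(X, k):
    """Estimate a k-banded precision matrix from samples X (n x p)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    T = np.eye(p)
    d = np.empty(p)
    d[0] = Xc[:, 0].var()
    for j in range(1, p):
        lo = max(0, j - k)
        Z = Xc[:, lo:j]                      # at most k predecessors
        y = Xc[:, j]
        coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
        T[j, lo:j] = -coef
        d[j] = (y - Z @ coef).var()          # residual variance
    return T.T @ np.diag(1.0 / d) @ T        # Omega_hat = T' D^{-1} T

def cv_select_bandwidth(X, bandwidths, n_folds=5, seed=0):
    """Pick the bandwidth minimizing a cross-validated quadratic-type loss."""
    n, p = X.shape
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    folds = np.array_split(idx, n_folds)
    scores = []
    for k in bandwidths:
        loss = 0.0
        for f in folds:
            train = np.setdiff1d(idx, f)
            Omega = banded_precision(X[train], k)
            S_val = np.cov(X[f], rowvar=False)   # validation covariance
            # quadratic-type discrepancy: || S_val Omega - I ||_F^2
            loss += np.linalg.norm(S_val @ Omega - np.eye(p), "fro") ** 2
        scores.append(loss / n_folds)
    return bandwidths[int(np.argmin(scores))], scores

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    p, n = 30, 200
    # AR(1)-style covariance, so the true precision matrix is banded
    Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    k_best, cv_scores = cv_select_bandwidth(X, bandwidths=list(range(1, 8)))
    print("selected bandwidth:", k_best)
```

In this sketch the CV loss is computed from the validation-fold sample covariance, so the selected bandwidth adapts to the available sample support; any other data-driven loss could be substituted in the same loop.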
