Abstract

Principal component analysis (PCA) is widely used for high-dimensional data analysis, with well-documented applications in computer vision, preference measurement, and bioinformatics. In this context, the fresh look advocated here permeates benefits from variable selection and compressive sampling to robustify PCA against outliers. A least-trimmed squares estimator of a low-rank component analysis model is shown to be closely related to the one obtained from an ℓ0-(pseudo)norm-regularized criterion encouraging sparsity in a matrix that explicitly models the outliers. This connection suggests efficient (approximate) solvers based on convex relaxation, which lead naturally to a family of robust estimators subsuming Huber's optimal M-class. Outliers are identified by tuning a regularization parameter, which amounts to controlling the sparsity of the outlier matrix along the whole robustification path of (group-)Lasso solutions. Novel algorithms are developed to: i) estimate the low-rank data model both robustly and adaptively; and ii) determine principal components robustly in (possibly) infinite-dimensional feature spaces. Numerical tests corroborate the effectiveness of the proposed robust PCA scheme for a video surveillance task.
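To make the convex-relaxation route concrete, below is a minimal sketch, not the paper's exact algorithm, of the kind of estimator the abstract describes: the data matrix is decomposed as a low-rank component plus an explicit outlier matrix whose rows are driven to zero by group-Lasso regularization. The names robust_pca, lam, and n_iter are illustrative assumptions, and rows of X are taken as the (possibly outlying) observations.

```python
import numpy as np

def robust_pca(X, rank, lam, n_iter=100):
    """Minimize ||X - L - O||_F^2 + lam * sum_n ||O[n, :]||_2
    over a rank-constrained L and a row-sparse outlier matrix O,
    via alternating minimization (a sketch, not the published solver)."""
    O = np.zeros_like(X)
    for _ in range(n_iter):
        # L-step: best rank-r fit of the outlier-compensated data (truncated SVD).
        U, s, Vt = np.linalg.svd(X - O, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # O-step: group soft-thresholding shrinks whole residual rows,
        # so only observations with large residuals survive as flagged outliers.
        R = X - L
        norms = np.maximum(np.linalg.norm(R, axis=1, keepdims=True), 1e-12)
        O = R * np.maximum(0.0, 1.0 - lam / (2.0 * norms))
    return L, O
```

Sweeping lam from large to small traces the robustification path mentioned above: at large lam the outlier matrix O is identically zero (plain PCA), and as lam decreases, rows of O become nonzero one group at a time, each flagging an observation as an outlier.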
