Abstract
Subspace learning is a popular approach for feature extraction and classification. However, its performance degrades heavily when the data are corrupted by large amounts of noise. Inspired by recent work in matrix recovery, we tackle this problem by learning, for classification, a subspace that is robust to noise and large variability. Specifically, we propose a novel Supervised Regularization based Robust Subspace (SRRS) approach via low-rank learning. Unlike existing subspace methods, our approach jointly learns low-rank representations and a robust subspace from noisy observations. At the same time, to improve classification performance, class label information is incorporated as a supervised regularization. The problem can then be formulated as a constrained rank-minimization problem, which can be solved efficiently by the inexact augmented Lagrange multiplier (ALM) algorithm. Our approach differs from current sparse representation and low-rank learning methods in that it explicitly learns a low-dimensional subspace that incorporates the supervised information. Extensive experimental results on four datasets demonstrate that our approach outperforms state-of-the-art subspace and low-rank learning methods in almost all cases, especially when the data contain large variations or are heavily corrupted by noise.
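The abstract names the solver (inexact ALM) but not the full objective. As a point of reference, the matrix-recovery work it cites as inspiration solves problems of the form min ||A||_* + λ||E||_1 subject to X = A + E (Robust PCA), and inexact ALM alternates closed-form updates of the low-rank and sparse parts with a dual ascent step. The sketch below is a minimal inexact ALM solver for that classic recovery problem only; it omits SRRS's supervised regularizer and subspace projection, and the defaults (λ = 1/√max(m, n), the μ schedule) are standard Robust PCA choices, not values taken from this paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def shrink(M, tau):
    """Elementwise soft thresholding: prox operator of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def inexact_alm_rpca(X, lam=None, rho=1.5, tol=1e-7, max_iter=500):
    """Recover low-rank A and sparse E with X ~= A + E via inexact ALM."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))      # standard RPCA default
    norm_X = np.linalg.norm(X, 'fro')
    spec = np.linalg.norm(X, 2)             # largest singular value
    Y = X / max(spec, np.abs(X).max() / lam)  # dual variable init
    mu = 1.25 / spec                        # penalty parameter
    A = np.zeros_like(X)
    E = np.zeros_like(X)
    for _ in range(max_iter):
        A = svt(X - E + Y / mu, 1.0 / mu)     # update low-rank part
        E = shrink(X - A + Y / mu, lam / mu)  # update sparse part
        R = X - A - E                         # constraint residual
        Y = Y + mu * R                        # dual ascent step
        mu = min(mu * rho, 1e7)               # increase penalty
        if np.linalg.norm(R, 'fro') / norm_X < tol:
            break
    return A, E
```

The alternating structure is the point: each subproblem has a closed-form proximal solution, so the loop is cheap per iteration. Replacing the l1 penalty with a column-wise l2,1 penalty and adding a label-driven term on the projected representations, as the abstract describes, would move this template toward the SRRS objective without changing the overall update scheme.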