Abstract

Least Squares Regression (LSR) is a powerful method for learning a transformation from a high-dimensional feature space to a low-dimensional label subspace. Because low-rank matrix decomposition methods perform well at uncovering low-dimensional subspace structure, many LSR extensions impose low-rank constraints to make intra-class regression targets more compact and similar. However, low-rank constraints imposed in the original space may destroy the data structure and weaken the model's ability to capture the manifold structure. In this paper, a Denoising Low-Rank Discrimination based Least Squares Regression (DLRDLSR) model is proposed to eliminate noise in the label space. First, the data are decomposed into a low-rank matrix and a sparse matrix in the label subspace. Second, constraints are imposed on the low-rank and sparse matrices to preserve structural details, and the low-rank matrix is then used for classification. Moreover, the ε-dragging technique is introduced to enlarge the margins between different classes and enhance discrimination, and an ℓ2-norm constraint is added to avoid overfitting. Experiments on a variety of image databases demonstrate that the proposed method outperforms state-of-the-art methods.
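The abstract names the main ingredients (a low-rank plus sparse split of the label-space representation, ε-dragging of the regression targets, and ℓ2 regularization) but not the exact objective or update rules. The sketch below is only a plausible, simplified arrangement of those ingredients in Python, not the authors' DLRDLSR algorithm; the function names, hyperparameters, and the alternating update order are all assumptions.

```python
import numpy as np

def svt(A, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(A, tau):
    """Element-wise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def dlrdlsr_sketch(X, Y, lam_lr=1.0, lam_sp=0.1, beta=0.01, n_iter=50):
    """Hypothetical alternating scheme combining the ingredients named in the abstract.

    X : (n_samples, n_features) data matrix
    Y : (n_samples, n_classes) one-hot label matrix
    Returns the regression matrix W and the low-rank label-space component Z.
    """
    n, d = X.shape
    c = Y.shape[1]
    B = np.where(Y > 0, 1.0, -1.0)       # epsilon-dragging directions (DLSR-style)
    M = np.zeros((n, c))                 # non-negative dragging magnitudes
    Z = np.zeros((n, c))                 # low-rank component, later used for classification
    E = np.zeros((n, c))                 # sparse component, treated as noise
    XtX = X.T @ X + beta * np.eye(d)     # l2-regularized Gram matrix (avoids overfitting)

    for _ in range(n_iter):
        T = Y + B * M                    # dragged regression targets
        W = np.linalg.solve(XtX, X.T @ T)  # ridge least-squares fit to the dragged targets
        P = X @ W                        # representation of the data in label space
        Z = svt(P - E, lam_lr)           # low-rank part of the label-space representation
        E = soft(P - Z, lam_sp)          # sparse residual absorbed as noise
        M = np.maximum(B * (P - Y), 0.0) # closed-form update keeping magnitudes >= 0
    return W, Z

# Toy usage: classify by the largest entry of the label-space projection.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 20))
    y = rng.integers(0, 3, size=60)
    Y = np.eye(3)[y]
    W, Z = dlrdlsr_sketch(X, Y)
    pred = np.argmax(X @ W, axis=1)
```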
