Abstract

With the dramatic increase in the dimensionality of data representations, extracting latent low-dimensional features has become essential for efficient classification. To address the weak discriminative power of the marginal representation and the difficulty of revealing the data manifold structure in most existing linear discriminant methods, we propose a more powerful discriminant feature extraction framework, namely, joint sparse locality-aware regression (JSLAR). In our model, we formulate a new strategy induced by the non-squared L2 norm to enhance the local intraclass compactness of the data manifold, which enables joint learning of the locality-aware graph structure and the desired projection matrix. In addition, we formulate a weighted retargeted regression that performs the marginal representation learning adaptively instead of using the general average interclass margin. To alleviate the disturbance of outliers and prevent overfitting, we measure the regression term and the locality-aware term, together with the regularization term, using joint L2,1 norms that enforce row sparsity. We then derive an effective iterative algorithm for solving the proposed model. Experimental results on a range of benchmark databases demonstrate that the proposed JSLAR outperforms several state-of-the-art approaches.
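To give a concrete sense of the L2,1-norm row-sparsity idea mentioned above, the following is a minimal sketch of a generic L2,1-regularized regression solved by a standard iteratively reweighted least-squares scheme. It is not the authors' JSLAR algorithm (which additionally learns the locality-aware graph and the weighted retargeted regression targets); the objective, function names, and parameter values are illustrative assumptions only.

```python
import numpy as np

def l21_norm(W):
    """L2,1 norm of W: the sum of the L2 norms of its rows."""
    return np.sum(np.linalg.norm(W, axis=1))

def l21_regression(X, Y, lam=1.0, n_iter=50, eps=1e-8):
    """Sketch: minimize ||X W - Y||_F^2 + lam * ||W||_{2,1}.

    A common iterative scheme (not necessarily the one used in JSLAR):
    each step solves a weighted ridge system with diagonal weights
    D_ii = 1 / (2 * ||w_i||_2) taken from the current row norms of W,
    which drives entire rows of W toward zero (row sparsity).
    """
    n, d = X.shape
    D = np.eye(d)
    for _ in range(n_iter):
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1) + eps  # eps avoids division by zero
        D = np.diag(1.0 / (2.0 * row_norms))
    return W

# Toy usage: 100 samples, 20 features, 3 one-hot class targets
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = np.eye(3)[rng.integers(0, 3, size=100)]
W = l21_regression(X, Y, lam=5.0)
print("rows with non-negligible weight:", int(np.sum(np.linalg.norm(W, axis=1) > 1e-3)))
```

In this kind of formulation, rows of W that are driven to (near) zero correspond to features that are effectively discarded, which is how joint L2,1 norms help suppress outliers and prevent overfitting in the regression and locality-aware terms.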
