Abstract

Least squares regression and ridge regression are simple and effective methods for feature selection and classification, and many methods based on them have been proposed. However, most of these methods suffer from the small-class problem, meaning that the number of projections they can learn is limited by the number of classes. In this paper, we propose jointly sparse reconstructed regression (JSRR) to solve this problem. Moreover, JSRR uses the L2,1-norm as its basic measurement, which enhances robustness to outliers and guarantees joint sparsity for discriminative feature selection. In addition, by integrating the properties of robust feature selection (RFS) and principal component analysis (PCA), JSRR is able to obtain projections that have minimal reconstruction error and strong discriminability for recognition tasks. We also propose an iterative algorithm to solve the optimization problem. A series of experiments are conducted to evaluate the performance of JSRR. Experimental results indicate that JSRR outperforms classical RR and several state-of-the-art regression methods.
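The L2,1-norm referred to in the abstract is the sum of the Euclidean norms of a matrix's rows; penalizing it drives entire rows of a projection matrix to zero, which is what yields joint (row-wise) sparsity for feature selection. A minimal NumPy sketch of this quantity (an illustration only, not the paper's implementation):

```python
import numpy as np

def l21_norm(W):
    """L2,1-norm: sum of the L2 norms of the rows of W.

    When W is a projection matrix (features x dimensions), a small
    L2,1-norm means many rows are entirely zero, so the corresponding
    features are discarded jointly across all projection directions.
    """
    return float(np.sum(np.linalg.norm(W, axis=1)))

W = np.array([[3.0, 4.0],   # row norm 5.0
              [0.0, 0.0],   # zero row: feature jointly discarded
              [1.0, 0.0]])  # row norm 1.0
print(l21_norm(W))  # 6.0
```

Unlike the elementwise L1-norm, this penalty couples all entries in a row, so a feature is either kept or removed for every projection at once.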
