Abstract

Traditional ridge regression (RR), which uses the $L_2$-norm as its basic measurement, is sensitive to outliers and risks overfitting when applied to recognition tasks. Moreover, the number of projections RR can learn is at most the number of classes. Linear discriminant analysis (LDA) is another well-known method for discriminative feature selection, but its learned projections are limited by the rank of the between-class scatter matrix. Thus both RR and LDA suffer from the small-class problem. To address these problems, we propose a method called robust jointly sparse regression (RJSR). RJSR replaces the $L_2$-norm with the $L_{2,1}$-norm in both the loss function and the regularization term, which guarantees robustness to outliers and the joint sparsity needed for effective feature selection. In addition, unlike existing $L_{2,1}$-norm-based methods, RJSR incorporates a flexible factor and a robust measurement to further guarantee robustness. An alternating iterative algorithm is designed to compute the optimal solution, and its convergence is proved. Experimental evaluation on several well-known data sets shows the merits of the proposed method for feature selection and classification, especially when face images are corrupted by block noise.
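To make the $L_{2,1}$-norm idea concrete, the sketch below computes the norm (the sum of the $L_2$ norms of a matrix's rows, which penalizes whole rows toward zero and thus induces joint sparsity across features) and solves a *simplified* variant of the objective by iterative reweighting. This is not the paper's full RJSR model — RJSR also applies the $L_{2,1}$-norm to the loss and adds a flexible factor, neither of which is specified in the abstract — so the objective, the regularization weight `lam`, and the initialization here are illustrative assumptions only.

```python
import numpy as np

def l21_norm(W):
    """L2,1 norm: the sum of the L2 norms of the rows of W.
    Penalizing it drives entire rows to zero (joint sparsity),
    which is why it is used for feature selection."""
    return np.sum(np.linalg.norm(W, axis=1))

def l21_regularized_regression(X, Y, lam=1.0, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver for the simplified problem
        min_W ||X W - Y||_F^2 + lam * ||W||_{2,1}.
    NOTE: a hypothetical sketch, not the paper's RJSR algorithm
    (RJSR also uses the L2,1 norm on the loss term)."""
    d = X.shape[1]
    # Initialize with an ordinary ridge solution.
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
    for _ in range(n_iter):
        # Reweighting: D_ii = 1 / (2 ||w_i|| + eps); eps avoids
        # division by zero when a row has been shrunk to zero.
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * row_norms + eps))
        # Closed-form update: (X^T X + lam * D) W = X^T Y.
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
    return W

# Toy usage: only the first 3 of 10 features generate Y, so the
# learned W should have small norms on the irrelevant rows.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
W_true = np.zeros((10, 3))
W_true[:3] = rng.standard_normal((3, 3))
Y = X @ W_true + 0.01 * rng.standard_normal((50, 3))
W_hat = l21_regularized_regression(X, Y, lam=5.0)
```

The update alternates between fixing the row weights `D` and re-solving a ridge-like linear system, which is one common way an "alternating iterative algorithm" for an $L_{2,1}$-regularized objective is organized; the paper's own scheme and convergence proof should be consulted for the exact RJSR updates.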
