Abstract

As an extension of conventional ridge regression, L2,1-norm based ridge regression methods have been widely used in subspace learning because they are more robust than Frobenius-norm based regression and also guarantee joint sparsity. However, conventional L2,1-norm regression methods suffer from the small-class problem and ignore local geometric structures, both of which degrade their performance. To address these problems, we propose a novel regression method called Locality Preserving Robust Regression (LPRR). In addition to using the L2,1-norm for jointly sparse regression, we adopt a capped L2-norm in the loss function to further enhance the robustness of the proposed algorithm. Moreover, to exploit local structure information, we integrate the property of locality preservation into our model, since it plays an important role in dimensionality reduction. The convergence analysis and computational complexity of the proposed iterative algorithm are presented. Experimental results on four datasets show that LPRR outperforms several well-known subspace learning methods in classification tasks.
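For reference, the norms named above have standard definitions, recorded in the sketch below together with one plausible form of the LPRR objective. The definitions of the L2,1-norm and the capped L2-norm are standard; the combined objective, the trade-off parameters lambda_1 and lambda_2, the threshold epsilon, and the graph Laplacian are illustrative assumptions, not taken from the abstract itself.

```latex
% Standard definitions. For a projection matrix W \in \mathbb{R}^{d \times c}
% with rows w^i, the L_{2,1}-norm sums the L_2 norms of the rows,
% which encourages joint (row-wise) sparsity:
\|W\|_{2,1} = \sum_{i=1}^{d} \|w^i\|_2 .

% The capped L_2-norm loss truncates large residuals at a threshold
% \varepsilon > 0, limiting the influence of outliers:
\ell_\varepsilon(r) = \min\left(\|r\|_2,\ \varepsilon\right).

% An assumed, illustrative form of an LPRR-style objective: capped L_2
% loss on the residuals, an L_{2,1} regularizer for joint sparsity, and a
% locality-preserving term built from a graph Laplacian \mathcal{L}:
\min_{W}\ \sum_{i=1}^{n} \min\left(\|W^{\top}x_i - y_i\|_2,\ \varepsilon\right)
 + \lambda_1 \|W\|_{2,1}
 + \lambda_2\, \operatorname{tr}\!\left(W^{\top} X \mathcal{L} X^{\top} W\right)
```

Here X = [x_1, ..., x_n] collects the training samples, Y = [y_1, ..., y_n] their label vectors, and the trace term penalizes projections that map neighboring samples far apart; the exact formulation used by LPRR may differ.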
