Abstract

In the fields of pattern recognition and data mining, two problems need to be addressed. First, the curse of dimensionality degrades the performance of many practical data-processing techniques. Second, noise and outliers prevent effective feature extraction from corrupted data. Recently, representation-based methods have produced promising results. However, these methods cannot handle the case in which nonlinear similarity exists, and they fail to provide a quantitative interpretation of feature importance. In this paper, we propose a novel low-rank and sparse representation method that achieves dimensionality reduction and robustly extracts latent low-dimensional discriminative features. Specifically, we first adopt multiple kernel learning to map the original data into an embedded reproducing kernel Hilbert space (RKHS), and then learn a kernel-based similarity discriminative projection to explore the within-class and between-class variability. Notably, this low-dimensional feature-learning strategy is seamlessly integrated into the low-rank matrix recovery of the kernel matrix. Next, we impose <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$l_{2,1}$ </tex-math></inline-formula>-norm regularization on the error matrix to eliminate noise, and on the projection matrix to make the selected features more compact and interpretable. The resulting non-convex optimization problem is solved efficiently by the alternating direction method of multipliers (ADMM). Extensive experiments on seven benchmark datasets demonstrate the effectiveness of our method.
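As a point of reference for the regularizer mentioned above, the following is a minimal sketch of the $l_{2,1}$ norm as it is commonly defined in the feature-selection literature: the sum of the Euclidean norms of the rows of a matrix, so that penalizing it drives entire rows (i.e., entire features) toward zero. The row-wise convention used here is an assumption; some papers sum over columns instead, and the paper's own objective, variable names, and solver are not reproduced here.

```python
import numpy as np

def l21_norm(M):
    """l_{2,1} norm of a matrix: sum of the l2 norms of its rows.

    Assumed convention: rows index features, so this penalty encourages
    whole rows of a projection matrix to vanish (row-sparse = feature
    selection). Some works apply the same norm column-wise instead.
    """
    return float(np.sum(np.linalg.norm(M, axis=1)))

# Toy projection matrix: row norms are 5.0, 0.0, and 1.0.
W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 0.0]])
print(l21_norm(W))  # 5.0 + 0.0 + 1.0 = 6.0
```

Because the penalty is a sum of row norms rather than a sum of squared entries, it behaves like an l1 penalty across rows, which is what yields the compact, interpretable feature subsets the abstract refers to.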
