Abstract
Feature extraction and feature selection have been treated as two independent dimensionality reduction methods in most of the existing literature. In this paper, we integrate both approaches into a unified framework and design an unsupervised linear feature selective projection (FSP) for feature extraction with low-rank embedding and dual Laplacian regularization, with the aim of exploiting the intrinsic relationships among data and suppressing the impact of noise. Specifically, a projection matrix with $l_{2,1}$-norm regularization is introduced to project the original high-dimensional data points into a lower-dimensional subspace, where the $l_{2,1}$-norm regularization endows the projection with good interpretability. We deploy a coefficient matrix with a low-rank constraint to reconstruct the data points, and the $l_{2,1}$-norm is imposed on the data reconstruction errors in the low-dimensional subspace to make FSP robust to noise. Furthermore, a dual graph Laplacian regularization term is imposed on the low-dimensional data and the data reconstruction matrix to preserve the local manifold geometric structure of the data. Finally, an alternating iterative algorithm is carefully designed to solve the proposed optimization model. Theoretical convergence and computational complexity analyses of the algorithm are also provided. Comprehensive experiments on various benchmark datasets have been carried out to evaluate the performance of the proposed FSP, and the results show that our algorithm significantly outperforms other state-of-the-art feature extraction methods.
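For concreteness, the components described above can be assembled into an objective of the following general form. This is only a sketch under assumed notation (data matrix $X$, projection matrix $P$, reconstruction coefficient matrix $Z$, graph Laplacians $L_1$ and $L_2$, trade-off parameters $\lambda_1,\dots,\lambda_4$, and the nuclear norm $\|\cdot\|_*$ as a common surrogate for the low-rank constraint), not necessarily the exact formulation solved in the paper:

% Hedged sketch of an FSP-style objective: robust projected reconstruction with an
% l_{2,1}-norm loss, a row-sparse (feature-selective) projection P, a low-rank
% coefficient matrix Z, and dual graph Laplacian regularization.
\begin{equation*}
\min_{P,\,Z}\;
\bigl\| P^{\top}X - P^{\top}X Z \bigr\|_{2,1}
+ \lambda_1 \| P \|_{2,1}
+ \lambda_2 \| Z \|_{*}
+ \lambda_3 \operatorname{tr}\!\bigl(P^{\top}X L_1 X^{\top}P\bigr)
+ \lambda_4 \operatorname{tr}\!\bigl(Z L_2 Z^{\top}\bigr)
\end{equation*}

Here the first term penalizes reconstruction errors sample-wise in the projected subspace (robustness to noise), $\|P\|_{2,1}$ encourages row sparsity of the projection (interpretable feature selection), $\|Z\|_*$ enforces the low-rank embedding, and the two trace terms correspond to the dual Laplacian regularization on the projected data and the reconstruction coefficients, respectively.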