Abstract

Feature selection has been a research hotspot in many fields, and models based on graph learning are currently among the most popular approaches. However, most such models are not strongly sparse, and graph learning over pairwise samples is time-consuming. ℓ2,1-norm regularization is the sparsity strategy adopted by most current sparse models because it is convex and easy to solve. Nevertheless, the sparsity induced by the ℓ2,1-norm is insufficient, and it introduces a regularization-parameter tuning problem. The ℓ2,0-norm is a better choice, as it imposes stronger sparsity constraints on the subspace. In this paper, Sparse feature selection via Fast Embedding Spectral Analysis (SFESA) is proposed. First, an adaptive anchor nearest-neighbor graph is constructed, which largely avoids the high time cost of learning a pairwise nearest-neighbor graph. A low-dimensional embedding that preserves the manifold structure of the data is obtained by spectral analysis of this graph. Second, the projected data are made to approximate the low-dimensional embedding via a regularization term. Finally, the ℓ2,0-norm is employed to constrain the projection matrix and enhance subspace sparsity, and a fast iterative algorithm is presented to solve the resulting non-convex optimization problem. Extensive experiments on multiple public datasets show that SFESA achieves excellent performance in less time.
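The abstract does not give SFESA's update rules, but the ℓ2,0-norm constraint it relies on has a standard interpretation: at most k rows of the projection matrix W may be nonzero, so each iteration typically includes a hard row-selection step that keeps the k rows of W with the largest ℓ2 norms. The sketch below illustrates only that generic step (all names and the toy data are illustrative, not from the paper):

```python
import numpy as np

def l20_row_projection(W, k):
    """Project W onto the l2,0 constraint ||W||_{2,0} <= k:
    keep the k rows with the largest l2 norms, zero out the rest.
    The surviving row indices correspond to the selected features."""
    row_norms = np.linalg.norm(W, axis=1)       # l2 norm of each row
    keep = np.argsort(row_norms)[-k:]           # indices of the top-k rows
    W_sparse = np.zeros_like(W)
    W_sparse[keep] = W[keep]                    # retain only selected rows
    return W_sparse, np.sort(keep)

# Toy example: project 5 features to a 2-D subspace, select 2 features.
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 2))
W_sparse, selected = l20_row_projection(W, k=2)
print(selected)                  # indices of the 2 selected features
print(np.count_nonzero(np.linalg.norm(W_sparse, axis=1)))  # exactly 2 nonzero rows
```

Unlike ℓ2,1 regularization, this constraint fixes the number of selected features directly (k), which is why the abstract notes it avoids the regularization-parameter tuning problem.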

