Abstract
Feature selection aims to find a concise set of features with good generalization capability by removing redundant, irrelevant, and noisy features. Recently, the regularized self-representation (RSR) method was proposed for unsupervised feature selection; it minimizes the L2,1-norm of both the residual matrix and the self-representation coefficient matrix. In this paper, we find that minimizing the L2,1-norm of the self-representation coefficient matrix alone cannot effectively extract strongly correlated features. Therefore, by adding a nuclear-norm minimization constraint on the self-representation coefficient matrix, we propose a new unsupervised feature selection method named low-rank regularized self-representation (LRRSR), which can effectively capture the global structure of the data. Experiments show that the proposed algorithm outperforms RSR and other related algorithms on clustering tasks.
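The abstract describes an objective that combines an L2,1-norm reconstruction residual, an L2,1-norm penalty on the self-representation coefficient matrix, and a low-rank (nuclear-norm) penalty. A minimal sketch of evaluating such an objective is given below; the exact form, the regularization weights `lam1`/`lam2`, and the row-norm feature-scoring rule are assumptions based on the text and on RSR-style methods, not the paper's verified formulation.

```python
import numpy as np

# Assumed LRRSR-style objective (a sketch, not the paper's exact model):
#   minimize over W:  ||X - X W||_{2,1} + lam1 * ||W||_{2,1} + lam2 * ||W||_*
# where X is (samples x features), W is the self-representation coefficient
# matrix, ||.||_{2,1} sums the Euclidean norms of the rows, and ||.||_* is
# the nuclear norm (sum of singular values), which encourages low rank.

def l21_norm(M):
    """Sum of the Euclidean norms of the rows of M."""
    return float(np.sum(np.linalg.norm(M, axis=1)))

def nuclear_norm(M):
    """Sum of the singular values of M (the nuclear / trace norm)."""
    return float(np.sum(np.linalg.svd(M, compute_uv=False)))

def lrrsr_objective(X, W, lam1=1.0, lam2=1.0):
    """Objective value for a candidate self-representation matrix W."""
    residual = X - X @ W
    return l21_norm(residual) + lam1 * l21_norm(W) + lam2 * nuclear_norm(W)

def feature_scores(W):
    """Row norms of W, used in RSR-style methods to rank features:
    features whose rows have large norms contribute most to
    reconstructing the other features."""
    return np.linalg.norm(W, axis=1)
```

Under this sketch, once an optimizer produces `W`, features would be ranked by `feature_scores(W)` and the top-k kept; the nuclear-norm term is what distinguishes the low-rank variant from plain RSR.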