Abstract
The performance of machine learning algorithms can be degraded by redundant and irrelevant features in high-dimensional data, and such features also increase the computational cost of model training. These problems can be addressed with techniques such as feature selection and dimensionality reduction. Unsupervised feature selection has drawn increasing attention because collecting labels for supervised feature selection is often difficult. To this end, we develop a novel approach based on nonnegative matrix factorization (NMF) to remove redundant information. In this technique, for the first time, both local and global information-preserving regularizations are applied to both the feature weight matrix and the representation matrix, which is why we call the method Dual-Dual regularized feature selection. Furthermore, the Schatten p-norm is used to exploit the inherent low-rank structure of the data. To demonstrate the effectiveness of the proposed method, experiments are conducted on six benchmark datasets. The computational results show that the proposed method outperforms state-of-the-art unsupervised feature selection techniques.
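To make the low-rank ingredient concrete, the sketch below computes the Schatten p-norm of a matrix from its singular values, defined as \(\|X\|_{S_p} = (\sum_i \sigma_i^p)^{1/p}\). This is a minimal generic NumPy illustration of the norm itself, not the authors' implementation or objective function; the function name and toy data are hypothetical.

```python
import numpy as np

def schatten_p_norm(X: np.ndarray, p: float) -> float:
    """Schatten p-norm: the l_p norm of the singular values of X.

    For p in (0, 1) this is a quasi-norm, commonly used as a nonconvex
    surrogate for matrix rank (p -> 0 approaches the rank; p = 1 gives
    the nuclear norm; p = 2 gives the Frobenius norm).
    """
    sigma = np.linalg.svd(X, compute_uv=False)  # singular values of X
    return float(np.sum(sigma ** p) ** (1.0 / p))

# Toy check: for p = 2 the Schatten norm equals the Frobenius norm.
rng = np.random.default_rng(0)
X = rng.random((5, 3))
assert np.isclose(schatten_p_norm(X, 2.0), np.linalg.norm(X, "fro"))
print(schatten_p_norm(X, 0.5))  # small p emphasizes low-rank structure
```

Choosing p below 1 penalizes small singular values more aggressively than the nuclear norm, which is the usual motivation for Schatten p-norm regularizers in low-rank matrix factorization models of this kind.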