Abstract
The curse of dimensionality has become a major challenge in data mining, pattern recognition, computer vision, and machine learning in recent years. Feature selection and feature extraction are the two main approaches to circumventing this challenge. The main objective of feature selection is to remove redundant features and preserve relevant ones in order to improve the performance of the learning algorithm. This survey provides a comprehensive overview of state-of-the-art feature selection techniques, including mathematical formulations and fundamental algorithms, to facilitate understanding. It encompasses approaches to feature selection that can be categorized into five domains: A) subspace learning, which involves matrix factorization and matrix projection; B) sparse representation learning, which includes compressed sensing and dictionary learning; C) information theory, which covers multi-label neighborhood entropy, symmetrical uncertainty, Monte Carlo methods, and the Markov blanket; D) evolutionary computation algorithms, including the genetic algorithm (GA), particle swarm optimization (PSO), ant colony optimization (ACO), and grey wolf optimization (GWO); and E) reinforcement learning techniques. This survey can help researchers acquire a deep understanding of feature selection techniques and choose an appropriate one; moreover, a researcher can focus on one of domains A through E for deeper future study. A potential avenue for future research is reducing computational complexity while maintaining performance, that is, achieving a more efficient balance between computational resources and overall performance. For matrix-based techniques, the main limitation lies in the need to tune the coefficients of the regularization terms, a process that can be challenging and time-consuming. For evolutionary computation techniques, getting stuck in local minima and finding an appropriate objective function are the two main limitations.
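As a concrete illustration of one of the information-theoretic criteria mentioned above, the sketch below ranks discrete features by symmetrical uncertainty, SU(X, Y) = 2·I(X; Y) / (H(X) + H(Y)). This is a minimal filter-style example under assumed toy data; the helper names (`entropy`, `symmetrical_uncertainty`) and the top-k selection step are illustrative, not taken from the survey.

```python
import numpy as np
from collections import Counter

def entropy(x):
    # Shannon entropy (bits) of a discrete sequence
    counts = np.array(list(Counter(x).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def symmetrical_uncertainty(x, y):
    # SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), normalized to [0, 1]
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))  # joint entropy H(X, Y)
    mi = hx + hy - hxy              # mutual information I(X; Y)
    denom = hx + hy
    return 0.0 if denom == 0 else 2.0 * mi / denom

# Toy data (assumed): feature f1 matches the label exactly, f2 is noise
y  = [0, 0, 1, 1, 0, 1]
f1 = [0, 0, 1, 1, 0, 1]
f2 = [0, 1, 0, 1, 1, 0]
scores = {name: symmetrical_uncertainty(f, y)
          for name, f in [("f1", f1), ("f2", f2)]}

# Filter step: rank features by SU and keep the top-k as the selected subset
selected = sorted(scores, key=scores.get, reverse=True)[:1]
```

Because SU is normalized by the marginal entropies, a feature identical to the label scores 1.0 while an uninformative one scores near 0, which is why SU is a popular redundancy-aware alternative to raw mutual information in filter methods.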
More From: Engineering Applications of Artificial Intelligence