Abstract

Feature selection aims to select a small subset of relevant features while maintaining or even improving classification performance relative to using all features. It can be framed as a multi-objective problem: minimizing the number of selected features and maximizing the classification accuracy (i.e., minimizing the classification error) simultaneously. Although evolutionary multi-objective algorithms perform well on continuous/numeric optimization problems, most encounter difficulties on feature selection tasks because of the discrete search space. This paper proposes a grid-dominance based multi-objective evolutionary algorithm for feature selection. The aim is to explore the potential of grid dominance to strengthen the selection pressure toward the optimal direction while maintaining a wide distribution among the objective values of feature subsets. To further increase population diversity, a subset filtration mechanism is proposed. The two proposed algorithms are evaluated on fourteen datasets of varying difficulty. Compared with other commonly used multi-objective algorithms, the proposed methods significantly improve the hypervolume and inverted generational distance metrics, and population diversity is also increased.
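To illustrate the core idea, the following is a minimal sketch of grid dominance for the two objectives named in the abstract (feature-subset size and classification error). It assumes a fixed grid over user-supplied lower/upper bounds with a uniform number of divisions per objective; the function names (`grid_coords`, `grid_dominates`) and parameters are illustrative, and actual grid-based algorithms typically re-derive the grid boundaries from the current population each generation.

```python
def grid_coords(objs, lb, ub, divisions):
    """Map a solution's objective vector to integer grid-cell indices.

    objs: objective values (to be minimized), e.g. (n_features, error_rate)
    lb, ub: assumed lower/upper bounds of the grid in each objective
    divisions: number of grid cells per objective
    """
    widths = [(u - l) / divisions for l, u in zip(lb, ub)]
    return [min(int((o - l) / w), divisions - 1) if w > 0 else 0
            for o, l, w in zip(objs, lb, widths)]


def grid_dominates(a, b, lb, ub, divisions=10):
    """True if a grid-dominates b: a's cell indices are no worse in every
    objective and strictly better in at least one. Solutions falling into
    the same cell do not grid-dominate each other, which coarsens Pareto
    dominance and strengthens selection pressure."""
    ga = grid_coords(a, lb, ub, divisions)
    gb = grid_coords(b, lb, ub, divisions)
    return all(x <= y for x, y in zip(ga, gb)) and ga != gb
```

For example, with bounds `lb=[0, 0]`, `ub=[1, 1]`, a solution at `(0.11, 0.11)` grid-dominates one at `(0.55, 0.55)`, while two solutions landing in the same cell, such as `(0.11, 0.12)` and `(0.12, 0.11)`, are mutually non-dominated under the grid relation even though neither Pareto-dominates the other anyway.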
