Abstract

Background: Feature selection, sometimes also known as attribute subset selection, is the process of selecting an optimal subset of features with respect to the target data, reducing dimensionality and removing irrelevant features. A dataset with n features has 2^n possible feature subsets, which makes the problem difficult to solve with conventional attribute selection methods. In such cases, metaheuristic-based methods generally outperform conventional methods. Objective: The main aim of this paper is to enhance classification accuracy while minimizing the number of selected features and the error rate. Methods: To achieve this objective, a binary metaheuristic feature selection method, bGWOSA, based on grey wolf optimization and simulated annealing is introduced. The proposed method uses simulated annealing to balance the trade-off between exploration and exploitation. Its performance is examined on ten feature selection benchmark datasets taken from the UCI repository and compared with binary cuckoo search, binary particle swarm optimization, binary grey wolf optimization, the binary bat algorithm, and a binary hybrid whale optimization method. Results: The proposed feature selection method achieves the highest accuracy on most of the datasets compared with the state-of-the-art methods. The experimental and statistical results further validate the efficacy of the proposed method. Conclusion: Classification accuracy can be enhanced by employing feature selection methods, and performance can be further improved by tuning the control parameters of metaheuristic methods.
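The abstract describes a hybrid of binary grey wolf optimization with a simulated-annealing step but does not give implementation details. The following minimal Python sketch illustrates the general idea only: a population of binary feature masks guided by the three best wolves, with a simulated-annealing acceptance test that occasionally keeps worse candidates to preserve exploration. The 1-NN wrapper fitness, fitness weights, sigmoid transfer function, and geometric cooling schedule are assumptions for illustration, not the authors' bGWOSA implementation.

# Illustrative sketch (assumed design, not the published bGWOSA code):
# binary grey-wolf-style search for feature subsets with SA acceptance.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, alpha=0.99):
    """Assumed wrapper fitness: weighted 1-NN error rate plus subset-size ratio (lower is better)."""
    if mask.sum() == 0:
        return 1.0  # penalise the empty subset
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=1),
                          X[:, mask.astype(bool)], y, cv=3).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

def bgwo_sa(X, y, n_wolves=8, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    wolves = rng.integers(0, 2, size=(n_wolves, n_feat))          # binary positions
    scores = np.array([fitness(w, X, y) for w in wolves])
    temp = 1.0                                                     # SA temperature
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter                                 # GWO control parameter: 2 -> 0
        alpha_w, beta_w, delta_w = wolves[np.argsort(scores)[:3]]  # three best wolves lead
        for i in range(n_wolves):
            prob = np.zeros(n_feat)
            for leader in (alpha_w, beta_w, delta_w):
                A = 2.0 * a * rng.random(n_feat) - a
                D = np.abs(2.0 * rng.random(n_feat) * leader - wolves[i])
                prob += 1.0 / (1.0 + np.exp(-10.0 * (np.abs(A * D) - 0.5)))  # sigmoid transfer
            candidate = (rng.random(n_feat) < prob / 3.0).astype(int)
            cand_score = fitness(candidate, X, y)
            # SA acceptance: always take improvements; accept worse candidates
            # with probability exp(-delta / temp) to keep exploring early on.
            delta = cand_score - scores[i]
            if delta < 0 or rng.random() < np.exp(-delta / max(temp, 1e-9)):
                wolves[i], scores[i] = candidate, cand_score
        temp *= 0.95                                               # assumed geometric cooling
    best = int(np.argmin(scores))
    return wolves[best], scores[best]

# Example usage (e.g. X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)):
# mask, score = bgwo_sa(X, y); print(mask.sum(), "features selected, fitness =", round(score, 4))

In this sketch the cooling temperature plays the role the abstract attributes to simulated annealing: early iterations accept more uphill moves (exploration), while later iterations become increasingly greedy (exploitation).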
