Abstract

In data mining and machine learning, feature selection is the critical task of choosing the optimal subset of features for the target data. A dataset with n features admits 2^n candidate feature subsets, which makes finding the best subset with standard approaches difficult. Consequently, this research proposes a new metaheuristic feature selection technique based on an adaptive squirrel search optimization algorithm (ASSOA). When metaheuristics are used to select features, the chosen subset often varies from run to run, which can lead to instability; the adaptive squirrel search is therefore used to balance the exploration and exploitation duties more evenly during optimization. A binary ASSOA search strategy, developed previously, is employed to select the best subset of features. The proposed approach reduces the number of selected features while maximizing classification accuracy. A ten-feature dataset from the University of California, Irvine (UCI) repository was used to compare the proposed method's performance against eleven state-of-the-art approaches: binary grey wolf optimization (bGWO), binary hybrid grey wolf and particle swarm optimization (bGWO-PSO), binary particle swarm optimization (bPSO), binary stochastic fractal search (bSFS), binary whale optimization algorithm (bWOA), binary modified grey wolf optimization (bMGWO), binary multiverse optimization (bMVO), binary satin bowerbird optimization (bSBO), binary hybrid GWO and genetic algorithm (bGWO-GA), binary firefly algorithm (bFA), and the binary genetic algorithm (bGA). Experimental results confirm the superiority and effectiveness of the proposed algorithm for solving the feature selection problem.
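The abstract's two objectives (fewer features, higher accuracy) are typically combined into a single fitness value in binary metaheuristic feature selection. As a minimal sketch only: the weighting below (alpha = 0.99) and the function name `fs_fitness` are assumptions, not the paper's stated formula.

```python
# Hypothetical sketch of a wrapper fitness commonly minimized in binary
# metaheuristic feature selection. The exact formula and weights used by
# ASSOA are not given in the abstract; alpha = 0.99 is an assumption.

def fs_fitness(mask, classification_error, alpha=0.99):
    """Lower is better: trades classification error off against subset size.

    mask                 -- binary list, 1 = feature selected
    classification_error -- error rate of a classifier trained on the subset
    """
    n = len(mask)
    selected = sum(mask)
    if selected == 0:            # an empty subset cannot classify anything
        return float("inf")
    return alpha * classification_error + (1 - alpha) * (selected / n)

# A smaller subset with slightly higher error can still score better:
full = fs_fitness([1] * 10, classification_error=0.10)
small = fs_fitness([1, 0, 0, 1, 0, 0, 0, 0, 0, 0], classification_error=0.105)
print(small < full)  # the 2-feature subset wins here
```

Each candidate in the binary swarm would be scored this way, so the search simultaneously prunes features and preserves accuracy.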
