Feature Selection (FS) is a crucial step in machine learning and data mining, as it minimizes the impact of redundant and irrelevant attributes on a model's accuracy. Researchers have therefore developed various algorithms to select the most informative features and improve accuracy on a given dataset. Nevertheless, these algorithms may become trapped in local optima when applied to datasets with large feature sizes. In this paper, to handle FS by reducing dimensionality while improving classification accuracy, an effective approach based on the Aquila Optimization (AO) algorithm is introduced. AO offers stable exploration and exploitation capabilities. It is enhanced by integrating a random position-amendment approach with a Local Search (LS) strategy to avoid local optima, and is then converted into a binary variant called Improved Binary AO (IBAO). Additionally, k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM) classifiers serve as quality estimators. On 18 multi-scale benchmarks, the IBAO algorithm is compared with the original BAO algorithm and twelve recent algorithms, including Binary Artificial Bee Colony (BABC), Binary Bat Algorithm (BBA), Binary Particle Swarm Optimization (BPSO), Binary Whale Optimization Algorithm (BWOA), Binary Grey Wolf Optimization (BGWO), Binary Sailfish Optimizer (BSFO), Binary Henry Gas Solubility Optimization (BHGSO), and Binary Harris Hawks Optimization (BHHO). According to Wilcoxon's rank-sum test (α=0.05), IBAO dominates significantly on both small- and large-dimensional benchmarks, achieving classification accuracy of up to 100% on some benchmarks while reducing the feature set size by up to 92%.
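To make the wrapper-FS setup described above concrete, the following is a minimal sketch of how a binary feature mask (as evolved by an optimizer like IBAO) is typically scored with a k-NN quality estimator. The fitness form (weighted error rate plus a feature-ratio penalty, weight `alpha=0.99`) is a common convention in binary metaheuristic FS, not necessarily the exact formulation of this paper; the function names and the tiny dataset are illustrative assumptions.

```python
def knn_accuracy(X, y, mask, k=3):
    """Leave-one-out accuracy of a plain k-NN classifier that only
    uses the features where mask[j] == 1."""
    idx = [j for j, m in enumerate(mask) if m]
    if not idx:  # an empty feature subset classifies nothing
        return 0.0
    correct = 0
    for i in range(len(X)):
        # squared Euclidean distance over the selected features only
        dists = sorted(
            (sum((X[i][f] - X[j][f]) ** 2 for f in idx), y[j])
            for j in range(len(X)) if j != i
        )
        votes = [label for _, label in dists[:k]]
        pred = max(set(votes), key=votes.count)  # majority vote
        correct += pred == y[i]
    return correct / len(X)

def fs_fitness(X, y, mask, alpha=0.99):
    """Wrapper-FS fitness to MINIMIZE: weighted classification error
    plus a penalty proportional to the fraction of features kept."""
    error = 1.0 - knn_accuracy(X, y, mask)
    ratio = sum(mask) / len(mask)
    return alpha * error + (1.0 - alpha) * ratio
```

A binary optimizer such as IBAO would flip bits in `mask` and keep the subset with the lowest `fs_fitness`, which is how high accuracy and a small feature count are rewarded simultaneously.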