Abstract

Feature selection commonly refers to the process of identifying an optimal feature subset with a candidate algorithm during the preprocessing stage of machine learning and data mining. This procedure reduces the set of features to be analyzed and can maximize classification performance on the selected feature combination. In this work, a hybrid model is developed to select optimal feature subsets for classification tasks, based on a novel binary version of the moth-flame optimizer (MFO) and the K-nearest neighbor (KNN) classifier. The proposed technique, abbreviated as MFeature or ESAMFO, applies several strategies, including two types of transfer functions, an ensemble strategy, a simulated annealing (SA) disturbance mechanism, and a crossover scheme, to improve the balance between the global exploration and local exploitation capabilities of the basic MFO. Each individual in the proposed algorithm is evaluated by the number of selected features and the error rate of the KNN classifier. The model's efficacy is assessed on 30 benchmark datasets of varying dimensionality from the UCI repository and compared with other KNN-based feature selection algorithms from the literature. Comprehensive comparisons show that the proposed technique reduces the classification error rate relative to the other feature selection algorithms, confirming the capability of ESAMFO to explore the feature space and select the most informative features for classification. For subsequent publications related to this research, readers can refer to https://aliasgharheidari.com.
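The abstract describes a wrapper objective that scores each candidate feature subset by both the KNN error rate and the size of the subset. A minimal sketch of such an objective is below; the weight `alpha`, the tiny 1-NN classifier, and the toy data layout are illustrative assumptions, not the paper's exact configuration.

```python
import math

def knn_error(train, labels, test, test_labels, mask):
    """Error rate of a simple 1-NN classifier restricted to the features
    where mask[i] == 1 (an illustrative stand-in for the paper's KNN)."""
    idx = [i for i, m in enumerate(mask) if m == 1]
    errors = 0
    for x, y in zip(test, test_labels):
        # Find the nearest training point using only the selected features.
        best_label, best_dist = None, math.inf
        for xt, yt in zip(train, labels):
            d = sum((x[i] - xt[i]) ** 2 for i in idx)
            if d < best_dist:
                best_dist, best_label = d, yt
        if best_label != y:
            errors += 1
    return errors / len(test)

def fitness(mask, train, labels, test, test_labels, alpha=0.99):
    """Smaller is better: weighted KNN error plus selected-feature ratio.
    alpha is an assumed weight balancing accuracy against subset size."""
    if sum(mask) == 0:  # an empty subset cannot classify anything
        return math.inf
    err = knn_error(train, labels, test, test_labels, mask)
    return alpha * err + (1 - alpha) * sum(mask) / len(mask)
```

Under this objective, two subsets with equal error rates are distinguished by their size, so the optimizer is steered toward smaller, equally accurate feature combinations.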
