Abstract

Feature selection (FS) is used to solve hard optimization problems in artificial intelligence and data mining. In the FS process, some, rather than all, of the features of a dataset are selected in order to both maximize classification accuracy and minimize the time required for computation. In this paper, an FS wrapper method that uses K-Nearest Neighbor (KNN) classification is subjected to two modifications based on a recent metaheuristic, the Monarch Butterfly Optimization (MBO) algorithm. The first modification, named MBOICO, employs an improved crossover operator to enhance FS. The second, named MBOLF, integrates the Levy flight distribution into MBO to improve convergence speed. Experiments are carried out on 25 benchmark datasets using the original MBO, MBOICO, and MBOLF. The results show that MBOICO is superior, so its performance is also compared against that of four metaheuristic algorithms (PSO, ALO, WOASAT, and GA). MBOICO achieves a high average classification accuracy of 93% across all datasets and significantly reduces the selection size. Hence, the findings demonstrate that MBOICO outperforms the other algorithms in terms of classification accuracy and the number of features chosen (selection size).
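To make the wrapper setup and the Levy flight idea concrete, the following Python sketch shows how a candidate feature subset (a binary mask) might be scored with KNN accuracy plus a penalty on selection size, and how a Mantegna-style Levy step can be generated. This is an illustrative sketch under assumed settings (weight alpha, k = 5, 5-fold cross-validation), not the authors' exact MBOICO/MBOLF implementation.

import numpy as np
from math import gamma
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y, alpha=0.99, k=5):
    """Higher is better: weighted KNN accuracy plus a reward for small subsets.
    alpha and k are illustrative assumptions, not values from the paper."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:            # an empty subset cannot classify anything
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=k)
    acc = cross_val_score(knn, X[:, selected], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - selected.size / mask.size)

def levy_flight(size, beta=1.5, rng=None):
    """Mantegna-style Levy step, commonly used to perturb search agents."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

if __name__ == "__main__":
    # Score a random feature subset on a toy dataset.
    from sklearn.datasets import load_breast_cancer
    X, y = load_breast_cancer(return_X_y=True)
    rng = np.random.default_rng(0)
    mask = rng.integers(0, 2, X.shape[1])
    print("fitness:", round(fitness(mask, X, y), 4))
    print("levy step sample:", levy_flight(3, rng=rng))

In a wrapper method of this kind, the metaheuristic (here MBO and its variants) would repeatedly generate candidate masks, evaluate them with such a fitness function, and keep the best-scoring subsets; the Levy step would perturb candidate positions to help escape local optima.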
