Abstract

In classification problems, there typically exists a large number of features, but not all of them contribute to improving classification performance. These redundant features make classification time-consuming and often degrade performance. Feature selection methods have been proposed to reduce the number of features, minimize computational cost, and maximize classification accuracy. As wrapper-based approaches, evolutionary algorithms have been widely applied to feature subset selection tasks. However, some of them become trapped in local optima, especially as the number of features increases, while others are computationally inefficient. This paper proposes a Modified Differential Evolution approach to Feature Selection (MDEFS) that uses two new mutation strategies to strike a feasible balance between exploration and exploitation and to keep classification performance within an acceptable range with respect to both the number of features and accuracy. Modifications are also made to the standard DE crossover and its key control parameters to further enhance the proposed algorithm's capabilities. The proposed method has been compared with several state-of-the-art methods on standard datasets from the UCI repository, and the experimental results demonstrate the superiority of the proposed approach.
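To make the general idea concrete, the following is a minimal, hedged sketch of wrapper-based feature selection with standard differential evolution (DE/rand/1/bin) — not the paper's MDEFS with its modified mutation strategies and crossover, which the abstract does not specify. All names, parameter values, the synthetic dataset, and the 1-NN wrapper classifier are illustrative assumptions: each individual is a continuous vector thresholded at 0.5 to yield a feature mask, and fitness is leave-one-out accuracy minus a small penalty on the number of selected features.

```python
# Hedged sketch: standard DE/rand/1/bin wrapper feature selection.
# NOT the paper's MDEFS; dataset, classifier, and parameters are illustrative.
import random

random.seed(0)

def make_data(n=40, d=8):
    # Tiny synthetic dataset: features 0 and 1 are informative, rest are noise.
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [label + random.gauss(0, 0.3), label + random.gauss(0, 0.3)]
        row += [random.gauss(0, 1) for _ in range(d - 2)]
        X.append(row)
        y.append(label)
    return X, y

def loo_1nn_accuracy(X, y, mask):
    # Leave-one-out 1-nearest-neighbour accuracy on the selected features.
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best, best_d = None, float("inf")
        for k in range(len(X)):
            if k == i:
                continue
            dist = sum((X[i][j] - X[k][j]) ** 2 for j in feats)
            if dist < best_d:
                best_d, best = dist, y[k]
        correct += best == y[i]
    return correct / len(X)

def fitness(vec, X, y):
    # Wrapper fitness: accuracy minus a small penalty on feature count.
    mask = [v > 0.5 for v in vec]
    return loo_1nn_accuracy(X, y, mask) - 0.01 * sum(mask) / len(mask)

def de_feature_selection(X, y, pop_size=14, gens=20, F=0.5, CR=0.9):
    d = len(X[0])
    pop = [[random.random() for _ in range(d)] for _ in range(pop_size)]
    fit = [fitness(p, X, y) for p in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation with three distinct random individuals.
            a, b, c = random.sample([k for k in range(pop_size) if k != i], 3)
            jr = random.randrange(d)  # index guaranteed to cross over
            trial = [
                min(1.0, max(0.0, pop[a][j] + F * (pop[b][j] - pop[c][j])))
                if (random.random() < CR or j == jr) else pop[i][j]
                for j in range(d)
            ]
            tf = fitness(trial, X, y)
            if tf >= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, tf
    best = max(range(pop_size), key=lambda i: fit[i])
    return [j for j, v in enumerate(pop[best]) if v > 0.5], fit[best]

X, y = make_data()
selected, score = de_feature_selection(X, y)
print("selected features:", selected, "fitness:", round(score, 3))
```

On this toy data the penalty term nudges the search toward small subsets; MDEFS's contribution, per the abstract, lies in replacing the plain mutation and crossover above with strategies that balance exploration and exploitation.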
