Abstract

Feature selection is the problem of choosing the most informative subset of features from a dataset, i.e., the features with the greatest impact on classification. It is an optimization problem that requires discarding irrelevant and redundant features without compromising the classification accuracy of the learning algorithm. Owing to the complexity of the problem, various stochastic methods have been employed and have produced promising results. In this work, we propose the Normalized Mutual Information-based Equilibrium Optimizer (NMIEO), a novel variant of the classical Equilibrium Optimizer used as a wrapper-filter framework for feature selection. NMIEO incorporates a novel local search strategy based on Normalized Mutual Information to enhance its exploitation capability. Additionally, chaotic maps are used for population initialization to improve the diversity of solutions. The proposed method is evaluated on 14 challenging high-dimensional datasets drawn from diverse domains, and the results are compared against eight well-known metaheuristic-based feature selection algorithms recently proposed in the literature. The experimental results and further analysis illustrate the superior performance of NMIEO relative to existing wrapper-based algorithms. Furthermore, the Wilcoxon signed-rank test demonstrates the statistical superiority of NMIEO over the competing methods across all selected metrics, confirming the effectiveness of the proposed approach.
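The two ingredients the abstract highlights, chaotic-map population initialization and a Normalized Mutual Information feature score, can be sketched roughly as below. This is a minimal illustration under common conventions (binary feature masks, the logistic map as the chaotic map, NMI normalized by the geometric mean of entropies); the function names are assumptions, not the authors' implementation.

```python
import numpy as np

def chaotic_init(pop_size, n_features, seed=0.7):
    """Initialize a binary population with the logistic map x -> 4x(1-x).

    The chaotic sequence spreads initial candidates over [0, 1] more
    evenly than many pseudo-random draws; thresholding at 0.5 yields
    binary feature-selection masks.
    """
    pop = np.empty((pop_size, n_features))
    x = seed  # seed must avoid fixed points of the map (e.g. 0, 0.5, 0.75)
    for i in range(pop_size):
        for j in range(n_features):
            x = 4.0 * x * (1.0 - x)  # logistic map iteration
            pop[i, j] = x
    return pop > 0.5  # True = feature selected

def entropy(labels):
    """Shannon entropy (nats) of a discrete label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def nmi(a, b):
    """Normalized mutual information between two discrete vectors,
    I(a; b) / sqrt(H(a) * H(b)), in [0, 1]."""
    mi = 0.0
    for va in np.unique(a):
        for vb in np.unique(b):
            p_ab = np.mean((a == va) & (b == vb))
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (np.mean(a == va) * np.mean(b == vb)))
    denom = np.sqrt(entropy(a) * entropy(b))
    return mi / denom if denom > 0 else 0.0
```

In a wrapper-filter setting, a score such as `nmi(X[:, j], y)` for each (discretized) feature `j` can bias a local search toward informative features, while the wrapper evaluates whole masks with the learning algorithm's accuracy.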
