Abstract

Feature Selection (FS) is a preprocessing step in Machine Learning (ML) that removes irrelevant or redundant features from datasets to improve the accuracy of ML classifiers. The Discrete Jaya (DJaya) algorithm is a popular binary variant of the continuous Jaya algorithm and is commonly used to solve optimization problems with binary design variables (also known as binary decision variables). Nevertheless, DJaya tends to converge prematurely to locally optimal solutions, and its performance deteriorates as the complexity of the optimization problem grows. This article improves DJaya by introducing the Improved Binary DJaya Algorithm (IBJA), which is specifically designed to solve the FS problem. IBJA integrates three techniques into DJaya. First, it incorporates the update equation of the Harris Hawks Optimization (HHO) algorithm into DJaya’s optimization loop to strengthen its search process and exploration ability. Second, it applies a new Dynamic Opposition-Based Learning (DOBL) mechanism in the final steps of the optimization loop to further boost its search and exploration capabilities. Third, it employs binary transfer functions to derive binary solutions from the real-valued solutions generated by HHO and DOBL. IBJA’s performance was assessed on 15 UCI datasets with four ML classifiers and compared against ten efficient optimization algorithms. In addition, the Friedman statistical test was employed to examine the reliability of the experimental findings. According to the overall experimental and statistical results, IBJA achieved the highest accuracy, the best objective values, and the fewest selected features on each of the 15 UCI datasets.
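
As an illustration of the third component, the sketch below shows how an S-shaped (sigmoid) transfer function can map a real-valued candidate solution to a binary feature-selection mask. This is a minimal sketch of the general binarization technique; the specific transfer function, thresholding rule, and names used here are illustrative assumptions rather than the exact formulation adopted in IBJA.

```python
import numpy as np

def s_shaped_transfer(x):
    """S-shaped (sigmoid) transfer function mapping real values to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def binarize(real_solution, rng):
    """Convert a real-valued candidate solution into a binary feature mask.

    A feature is selected (bit = 1) when its transfer probability exceeds
    a uniform random threshold, which is a common binarization rule in
    binary metaheuristics.
    """
    probs = s_shaped_transfer(real_solution)
    return (probs > rng.random(real_solution.shape)).astype(int)

# Example: a real-valued solution such as one produced by an HHO- or
# DOBL-style update (values here are synthetic, for illustration only).
rng = np.random.default_rng(42)
real_solution = rng.normal(size=8)          # 8 candidate features
binary_mask = binarize(real_solution, rng)  # 1 = feature kept, 0 = feature dropped
print(binary_mask)
```

In this scheme, the continuous search dynamics of the underlying optimizer are preserved, and binarization is applied only when a solution must be evaluated as a feature subset.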
