Abstract

Feature selection is an optimization problem and an important pre-processing step in data mining that simultaneously aims to minimize the number of selected features and maximize model generalization. Because of the large search space, conventional optimization methods often fail to produce globally optimal solutions. A variety of hybrid techniques merging different search strategies have been proposed in the feature-selection literature, but they mostly deal with low-dimensional datasets. In this paper, a hybrid optimization method for numerical optimization and feature selection is proposed, which integrates the sine cosine algorithm (SCA) into Harris hawks optimization (HHO). The SCA integration addresses the ineffective exploration of HHO, while exploitation is enhanced by dynamically adjusting candidate solutions to avoid stagnation. The proposed method, named SCHHO, is evaluated on the CEC'17 test suite for numerical optimization and on sixteen datasets of low and high dimensionality (exceeding 15,000 attributes), and compared with the original SCA and HHO as well as other well-known optimization methods, including the dragonfly algorithm (DA), whale optimization algorithm (WOA), grasshopper optimization algorithm (GOA), grey wolf optimizer (GWO), and salp swarm algorithm (SSA), in addition to state-of-the-art methods. Its performance is also validated against hybrid methods proposed in recent related literature. Extensive experimental and statistical analyses suggest that the proposed hybrid variant of HHO produces efficient search results without additional computational cost. With increased convergence speed, SCHHO reduced the feature size by up to 87% and achieved accuracy of up to 92%. Motivated by these findings, several potential future directions are also highlighted.
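The abstract does not give the update equations, but the core hybridization idea can be illustrated with a minimal sketch: the SCA sine-cosine move is assumed to replace HHO's exploration phase, and a simplified soft/hard besiege stands in for exploitation. The objective function, parameter values, and names below are illustrative assumptions, not the paper's SCHHO implementation.

```python
import numpy as np

def sphere(x):
    """Illustrative test objective (assumption; not from the paper)."""
    return np.sum(x ** 2)

def sca_hho_sketch(obj, dim=10, pop=30, iters=200, lb=-10.0, ub=10.0, seed=0):
    """Minimal SCA-in-HHO hybrid sketch (simplified; omits Levy-flight dives
    and the paper's dynamic candidate-adjustment mechanism)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))        # population of hawks
    fit = np.apply_along_axis(obj, 1, X)
    best = X[fit.argmin()].copy()              # rabbit: best solution so far
    best_fit = fit.min()

    for t in range(iters):
        E0 = 2 * rng.random(pop) - 1           # initial escape energy per hawk
        E = 2 * E0 * (1 - t / iters)           # energy decays over iterations
        r1 = 2 - 2 * t / iters                 # SCA amplitude, shrinks to 0

        for i in range(pop):
            if abs(E[i]) >= 1:                 # exploration via the SCA update
                r2 = 2 * np.pi * rng.random(dim)
                r3 = 2 * rng.random(dim)
                dist = np.abs(r3 * best - X[i])
                step = np.where(rng.random(dim) < 0.5, np.sin(r2), np.cos(r2))
                X[i] = X[i] + r1 * step * dist
            else:                              # exploitation: besiege the rabbit
                J = 2 * (1 - rng.random())     # random jump strength of the rabbit
                if abs(E[i]) >= 0.5:           # soft besiege
                    X[i] = (best - X[i]) - E[i] * np.abs(J * best - X[i])
                else:                          # hard besiege
                    X[i] = best - E[i] * np.abs(best - X[i])

            X[i] = np.clip(X[i], lb, ub)
            f = obj(X[i])
            if f < best_fit:
                best, best_fit = X[i].copy(), f

    return best, best_fit

if __name__ == "__main__":
    sol, val = sca_hho_sketch(sphere)
    print("best fitness:", val)
```

Under these assumptions, the sine-cosine term drives wide, best-guided exploration early on (large r1), while the decaying escape energy E hands control to the besiege steps later, mirroring the exploration/exploitation division described above.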
