Abstract

Feature selection (FS) is recognized as one of the most common and challenging problems in the Machine Learning domain. FS can be formulated as an optimization problem that requires an effective optimizer to determine the optimal subset of the most informative features. This paper proposes a wrapper FS method, called CGSO, that combines chaotic maps (CMs) with a binary Group Search Optimizer (GSO) to solve the FS problem. In this method, five chaotic maps, namely Logistic, Piecewise, Singer, Sinusoidal, and Tent, are incorporated into the main procedures of the GSO algorithm. The GSO algorithm is used as the search strategy, while k-NN is employed as the induction algorithm. The objective function integrates three main objectives: maximizing classification accuracy, minimizing the number of selected features, and minimizing the complexity of the generated k-NN models. To evaluate the performance of the proposed method, twenty well-known UCI datasets are used, and the results are compared with those of well-known methods published in the literature. The obtained results reveal the superiority of the proposed method over other well-known methods, especially when the binary GSO is combined with the Tent CM. Finally, the proposed method is beneficial for systems that require FS as a pre-processing step.
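
To make the wrapper setup concrete, the following is a minimal sketch of how a chaotic-map-driven binary search with a k-NN fitness evaluation typically looks. It is illustrative only: the weights, the choice of k = 5, the simplified bit-flip perturbation (rather than the full binary GSO producer/scrounger/ranger procedures), and the omission of the paper's model-complexity term are all assumptions, not the authors' exact formulation.

```python
# Sketch of a wrapper FS fitness with k-NN and a Tent chaotic map.
# Assumed: weighted-sum fitness, alpha/beta weights, k=5, 5-fold CV.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def tent_map(x: float) -> float:
    """Classic Tent chaotic map on (0, 1); here it perturbs the search."""
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)


def fitness(mask: np.ndarray, X: np.ndarray, y: np.ndarray,
            alpha: float = 0.99, beta: float = 0.01) -> float:
    """Weighted sum of classification error and selected-feature ratio.

    Lower is better. The k-NN model-complexity objective mentioned in the
    abstract is not modelled in this sketch.
    """
    if mask.sum() == 0:                      # an empty subset is invalid
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(knn, X[:, mask == 1], y, cv=5).mean()
    return alpha * (1.0 - acc) + beta * (mask.sum() / mask.size)


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    rng = np.random.default_rng(42)
    # The chaotic sequence drives the probability of flipping feature bits.
    chaos, mask = 0.63, rng.integers(0, 2, X.shape[1])
    best = fitness(mask, X, y)
    for _ in range(20):                      # tiny illustrative search loop
        chaos = tent_map(chaos)
        flip = rng.random(mask.size) < chaos * 0.1
        candidate = np.where(flip, 1 - mask, mask)
        f = fitness(candidate, X, y)
        if f < best:
            best, mask = f, candidate
    print(f"best fitness={best:.4f}, selected={int(mask.sum())}/{mask.size}")
```

In the full CGSO method, the chaotic sequence would replace or modulate the random numbers inside the GSO update rules rather than drive a plain bit-flip loop as above.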
