Abstract

The classification accuracy of a multi-layer perceptron (MLP) depends on the selection of relevant features from the data set, its architecture, its connection weights and its transfer functions. Finding optimal values for all of these components together is a complex task. Metaheuristic algorithms are a popular choice among researchers for solving such complex optimization problems. This paper presents a hybrid metaheuristic algorithm, the simple matching-grasshopper new cat swarm optimization algorithm (SM-GNCSOA), that optimizes all four components simultaneously. SM-GNCSOA combines the grasshopper optimization algorithm, a new variant of the binary grasshopper optimization algorithm called the simple matching-binary grasshopper optimization algorithm, and a new variant of the cat swarm optimization algorithm called the new cat swarm optimization algorithm to generate an optimal MLP. Because the selected features strongly influence the classification accuracy of a classifier, we also propose a new feature penalty function and use it in SM-GNCSOA to prevent underfitting or overfitting caused by the number of selected features. To evaluate SM-GNCSOA, several of its variants are proposed and their classification accuracies are compared with that of SM-GNCSOA on ten classification data sets. The results show that SM-GNCSOA performs better on most of the data sets owing to its ability to balance exploration and exploitation and to avoid local minima.

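The abstract does not specify how candidate solutions are encoded or what form the feature penalty function takes. The sketch below is only a minimal illustration, assuming a candidate consists of a binary feature mask, a hidden-layer width, a flat weight vector and a transfer-function index, and assuming a fitness value that adds a generic penalty on the fraction of selected features to the classification error. The names `feature_penalty`, `mlp_predict` and `fitness`, as well as the penalty form, are placeholders rather than the paper's definitions.

```python
import numpy as np

# Hypothetical encoding of one SM-GNCSOA candidate: a binary feature mask,
# a hidden-layer width, a flat weight vector, and a transfer-function index.
TRANSFER_FUNCTIONS = {
    0: lambda z: np.tanh(z),
    1: lambda z: 1.0 / (1.0 + np.exp(-z)),   # logistic sigmoid
    2: lambda z: np.maximum(0.0, z),          # ReLU
}

def feature_penalty(n_selected, n_total, alpha=0.1):
    """Illustrative penalty on the fraction of selected features.

    The paper's actual penalty function is not given in the abstract; this
    placeholder simply discourages selecting too few or too many features.
    """
    ratio = n_selected / n_total
    return alpha * abs(ratio - 0.5) * 2.0     # 0 at 50% of features, alpha at 0% or 100%

def mlp_predict(X, hidden, weights, transfer):
    """Single-hidden-layer MLP forward pass for binary classification."""
    n_in = X.shape[1]
    w1 = weights[: n_in * hidden].reshape(n_in, hidden)
    w2 = weights[n_in * hidden : n_in * hidden + hidden].reshape(hidden, 1)
    h = transfer(X @ w1)
    return (h @ w2).ravel() > 0.0

def fitness(candidate, X, y, alpha=0.1):
    """Fitness to minimize: classification error plus the feature penalty."""
    mask = candidate["mask"].astype(bool)
    if not mask.any():                        # guard: at least one feature must be selected
        return 1.0 + alpha
    X_sel = X[:, mask]
    transfer = TRANSFER_FUNCTIONS[candidate["transfer"]]
    y_hat = mlp_predict(X_sel, candidate["hidden"], candidate["weights"], transfer)
    error = np.mean(y_hat != y)
    return error + feature_penalty(mask.sum(), X.shape[1], alpha)

# Toy usage: score one random candidate on random data.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 8)), rng.integers(0, 2, size=100)
candidate = {
    "mask": rng.integers(0, 2, size=8),
    "hidden": 5,
    "weights": rng.normal(size=8 * 5 + 5),
    "transfer": 0,
}
print("fitness:", fitness(candidate, X, y))
```

In such a scheme, a metaheuristic like SM-GNCSOA would repeatedly perturb the mask, architecture, weights and transfer-function choice of each candidate and keep those with lower fitness; the exact update rules of the grasshopper and cat swarm components are described in the paper itself.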