Abstract

The training of a feed-forward neural network (FFNN) can be formulated as an optimization problem. In this paper, we present a new variant of the Harris hawks optimization (HHO) algorithm to minimize the mean square error (MSE). To balance the global and local search of the traditional HHO algorithm, we incorporate a chaotic map into it. The proposed algorithm is named the Chaotic Harris Hawks Optimization (CHHO) algorithm. We apply the CHHO to train the FFNN. To verify the efficiency of the CHHO algorithm, we test it on five classification datasets and compare it against eight meta-heuristic algorithms from the literature. The experimental results show that the proposed CHHO algorithm achieves the best overall performance and outperforms the other meta-heuristic algorithms on the reported performance metrics.

Keywords: Harris hawks optimization, Feed-forward neural network, Swarm intelligence algorithms, Chaotic maps, Global optimization
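The abstract does not specify where the chaotic map enters HHO or the exact network architecture. As a rough illustration only, the sketch below assumes a logistic chaotic map modulating HHO's escaping-energy term and an MSE fitness evaluated over a flattened weight vector of a single-hidden-layer FFNN; all function names, the architecture, and the choice of logistic map are assumptions for illustration, not the paper's confirmed design.

```python
import numpy as np

def logistic_map(x, r=4.0):
    """One step of the logistic chaotic map, a common choice in chaotic metaheuristic variants."""
    return r * x * (1.0 - x)

def mse_fitness(weights, X, y, n_hidden):
    """MSE of a one-hidden-layer FFNN whose weights and biases are flattened into one vector.

    Assumed layout: [input-to-hidden weights | hidden biases | hidden-to-output weights | output bias].
    """
    n_in = X.shape[1]
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden : n_in * n_hidden + n_hidden]
    w2 = weights[n_in * n_hidden + n_hidden : -1]
    b2 = weights[-1]
    hidden = np.tanh(X @ w1 + b1)                        # hidden-layer activations
    y_hat = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))    # sigmoid output for binary classification
    return np.mean((y - y_hat) ** 2)

def chaotic_energy(t, T, c):
    """Escaping-energy schedule for iteration t of T, driven by a chaotic value c in (0, 1).

    Replacing HHO's uniform random factor with a chaotic sequence is one common way to
    balance exploration (|E| >= 1) and exploitation (|E| < 1).
    """
    c_next = logistic_map(c)
    E = 2.0 * c_next * (1.0 - t / T)
    return E, c_next
```

In a full CHHO loop, each hawk would be a candidate weight vector scored by `mse_fitness`, and `chaotic_energy` would decide, per iteration, whether the population explores new regions or converges on the current best solution.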
