Abstract
Training feed-forward neural networks, which are frequently used for classification, is a well-known and hard optimization problem. Swarm intelligence metaheuristics have been applied successfully to such problems. In this chapter we present how the cuckoo search algorithm, the bat algorithm, and a modified version of the bat algorithm were adjusted and applied to the training of feed-forward neural networks. We used these three algorithms to search for the optimal synaptic weights of the neural network in order to minimize the error function. Testing was performed on four well-known benchmark classification problems. Since the number of neurons in the hidden layers may strongly influence the performance of an artificial neural network, we considered several network architectures with different numbers of hidden neurons. The results show that the performance of the cuckoo search and bat algorithms is comparable to that of other state-of-the-art nondeterministic optimization algorithms, with some advantage for cuckoo search. However, the modified bat algorithm outperformed all the other algorithms, which shows the great potential of this recent swarm intelligence algorithm.
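To illustrate the general idea described in the abstract, the following is a minimal sketch of how a swarm intelligence metaheuristic can be wrapped around a feed-forward network's error function, treating the flattened synaptic weights as the candidate solution vector. The XOR dataset, the network sizes, the Gaussian random walk standing in for Lévy flights, and all parameter values are illustrative assumptions, not the chapter's actual benchmarks, algorithm details, or settings.

```python
import numpy as np

# Sketch only: a one-hidden-layer feed-forward network whose weights are
# optimized by a simplified cuckoo-search-style loop instead of backpropagation.
rng = np.random.default_rng(0)

# Illustrative dataset (XOR); the chapter uses four standard benchmark problems.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights and biases


def unpack(w):
    """Split a flat weight vector into weight matrices and bias vectors."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2


def error(w):
    """Mean squared error of the network defined by weight vector w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                    # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return float(np.mean((out - y) ** 2))


# Simplified cuckoo-search-style loop: a population of candidate weight
# vectors ("nests"), new candidates produced by random steps (a stand-in
# for Levy flights), and a fraction of the worst nests abandoned.
POP, ITERS, PA = 20, 2000, 0.25
nests = rng.normal(0, 1, (POP, DIM))
fitness = np.array([error(n) for n in nests])

for _ in range(ITERS):
    # New solution via a random walk from a randomly chosen nest.
    candidate = nests[rng.integers(POP)] + 0.1 * rng.standard_normal(DIM)
    f = error(candidate)
    # Replace a randomly chosen nest if the candidate is better.
    j = rng.integers(POP)
    if f < fitness[j]:
        nests[j], fitness[j] = candidate, f
    # Abandon a fraction PA of the worst nests and re-initialize them.
    worst = np.argsort(fitness)[-int(PA * POP):]
    nests[worst] = rng.normal(0, 1, (len(worst), DIM))
    fitness[worst] = [error(n) for n in nests[worst]]

best = nests[np.argmin(fitness)]
print("best training error:", error(best))
```

The bat algorithm and its modified version would plug into the same skeleton: only the rule that generates new candidate weight vectors (frequency, velocity, loudness, and pulse-rate updates instead of random walks and nest abandonment) changes, while the error function over the flattened weights stays the same.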