Abstract

Presenting a satisfactory and efficient training algorithm for artificial neural networks (ANN) has been a challenging task. The Gravitational Search Algorithm (GSA) is a novel heuristic algorithm based on the law of gravity and mass interactions. Like most other heuristic algorithms, it has a good ability to search for the global optimum, but suffers from slow search speed. By contrast, the Back-Propagation (BP) algorithm can achieve faster convergence in the neighbourhood of the global optimum. In this study, a hybrid of GSA and BP is proposed to exploit the advantages of both algorithms. The proposed hybrid algorithm is employed as a new training method for feedforward neural networks (FNNs). To investigate the performance of the proposed approach, two benchmark problems are used and the results are compared with those obtained from FNNs trained by the original GSA and BP algorithms. The experimental results show that the proposed hybrid algorithm outperforms both GSA and BP in training FNNs.
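The abstract only outlines the hybrid at a high level. As an illustration, the following minimal Python sketch shows one common way such a GSA+BP hybrid can be organised: GSA first searches the flattened weight vector of a small feedforward network globally, and back-propagation then fine-tunes the best agent found. The XOR task, network size, and all hyperparameters here are assumptions for illustration and are not taken from the paper.

# Minimal sketch of a GSA + BP hybrid trainer for a tiny FNN (assumed setup:
# 2-4-1 network on the XOR problem; all hyperparameters are illustrative).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR problem (2 inputs, 1 output).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

H = 4                      # hidden units (assumed)
DIM = 2 * H + H + H + 1    # total weights + biases of a 2-H-1 network

def unpack(w):
    """Split a flat weight vector into the network's parameter arrays."""
    i = 0
    W1 = w[i:i + 2 * H].reshape(2, H); i += 2 * H
    b1 = w[i:i + H];                   i += H
    W2 = w[i:i + H].reshape(H, 1);     i += H
    b2 = w[i:i + 1]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))
    return h, out

def mse(w):
    _, out = forward(w, X)
    return np.mean((out - y) ** 2)

# ---- Phase 1: GSA global search over flat weight vectors ----
N, ITERS, G0 = 20, 100, 50.0
pos = rng.uniform(-1, 1, (N, DIM))
vel = np.zeros((N, DIM))
for t in range(ITERS):
    fit = np.array([mse(p) for p in pos])
    best, worst = fit.min(), fit.max()
    # Gravitational masses: better (lower-error) agents are heavier.
    m = (worst - fit) / (worst - best + 1e-12)
    M = m / (m.sum() + 1e-12)
    G = G0 * np.exp(-20 * t / ITERS)          # decaying gravitational constant
    acc = np.zeros_like(pos)
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            diff = pos[j] - pos[i]
            dist = np.linalg.norm(diff) + 1e-12
            acc[i] += rng.random() * G * M[j] * diff / dist
    vel = rng.random((N, DIM)) * vel + acc
    pos = pos + vel

w = pos[np.argmin([mse(p) for p in pos])].copy()   # best agent found by GSA

# ---- Phase 2: BP fine-tuning around the GSA solution ----
lr = 0.5
for _ in range(2000):
    h, out = forward(w, X)
    W1, b1, W2, b2 = unpack(w)
    d_out = (out - y) * out * (1 - out) * 2 / len(X)   # dMSE/dz at the output
    d_h = (d_out @ W2.T) * (1 - h ** 2)                # back-prop through tanh
    grad = np.concatenate([
        (X.T @ d_h).ravel(), d_h.sum(0),
        (h.T @ d_out).ravel(), d_out.sum(0),
    ])
    w -= lr * grad

print("final MSE:", mse(w))

The two-phase structure reflects the trade-off described above: the population-based GSA phase avoids poor local minima at the cost of speed, while the gradient-based BP phase converges quickly once the search has reached the vicinity of a good optimum.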
