Abstract

This paper introduces a novel PSO-GA based hybrid training algorithm with Adam Optimization and compares its performance against the generic Gradient Descent based Backpropagation algorithm with Adam Optimization for training Artificial Neural Networks. We aim to overcome the shortcomings of the traditional algorithm, such as a slower convergence rate and frequent convergence to local minima, by exploiting the characteristics of evolutionary algorithms. PSO converges quickly, which compensates for the slow convergence of traditional BP caused by small gradient values, while the integration with GA addresses the tendency to converge to local minima, since GA performs an efficient global search. By integrating these algorithms, we propose a new hybrid algorithm for training ANNs. We compare both algorithms on a medical diagnosis application. Results show that the proposed hybrid training algorithm significantly outperforms the traditional training algorithm, improving the accuracy of the ANNs by 20% in average testing accuracy and 0.7% in best testing accuracy.
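Since the abstract does not detail the algorithm, the following is only a minimal illustrative sketch of how PSO's velocity update (fast convergence toward good solutions) and GA's crossover and mutation (global search) can be combined to optimize the flattened weight vector of a small ANN. The population size, inertia and acceleration coefficients, mutation scale, 2-4-1 architecture, and toy XOR-style data are all assumptions, and the paper's Adam integration is omitted; this is not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_loss(weights, X, y):
    """Decode a flat weight vector into a 2-4-1 ANN and return its MSE on (X, y)."""
    W1 = weights[:8].reshape(2, 4)
    b1 = weights[8:12]
    W2 = weights[12:16].reshape(4, 1)
    b2 = weights[16]
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    return np.mean((pred - y) ** 2)

def pso_ga_train(X, y, dim, pop=30, iters=200, w=0.7, c1=1.5, c2=1.5, mut_sigma=0.1):
    # PSO state: positions, velocities, personal bests, and the global best
    pos = rng.normal(0.0, 1.0, (pop, dim))
    vel = np.zeros((pop, dim))
    pbest = pos.copy()
    pbest_f = np.array([mse_loss(p, X, y) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(iters):
        # PSO step: velocity/position update pulls particles toward the bests
        r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        fit = np.array([mse_loss(p, X, y) for p in pos])

        # GA step: replace the worst half with crossover + mutation of elite parents
        order = fit.argsort()
        elites, worst = order[:pop // 2], order[pop // 2:]
        for i in worst:
            pa, pb = pos[rng.choice(elites)], pos[rng.choice(elites)]
            alpha = rng.random(dim)
            child = alpha * pa + (1 - alpha) * pb            # arithmetic crossover
            child += rng.normal(0.0, mut_sigma, dim)         # Gaussian mutation
            pos[i], vel[i] = child, 0.0
            fit[i] = mse_loss(child, X, y)

        # Update personal and global bests
        improved = fit < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], fit[improved]
        if pbest_f.min() < mse_loss(gbest, X, y):
            gbest = pbest[pbest_f.argmin()].copy()
    return gbest

# Toy usage: learn XOR-like data with a 2-4-1 network (dim = 8 + 4 + 4 + 1 = 17)
X = rng.random((64, 2))
y = ((X[:, 0] > 0.5) ^ (X[:, 1] > 0.5)).astype(float)
best_w = pso_ga_train(X, y, dim=17)
print("final MSE:", mse_loss(best_w, X, y))
```

The intent of the hybrid is visible in the loop: the PSO update provides fast, directed movement of the whole swarm, while the GA step reseeds poorly performing particles away from local minima.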
