Abstract

This paper presents a comparative study of commonly used global optimization methods for improving the training performance of back-propagation neural networks. The optimization methods adopted are simulated annealing, direct search, and the genetic algorithm. These methods are used to optimize the networks' weights and biases before the back-propagation algorithm is applied, in order to keep the networks from becoming trapped in local minima. Four benchmark regression (prediction) datasets were used to evaluate the resulting models. The experimental results indicate that optimizing a neural network's parameters is a difficult problem because of the high dimensionality of the variables to be optimized, and only the genetic algorithm was able to solve it. This paper also applies the successful method to predict monthly rainfall time series in the northeast region of Thailand. The results indicate that combining the genetic algorithm with a back-propagation neural network is a recommended approach.
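
A minimal sketch of this two-stage idea is given below, assuming a single-hidden-layer network, a toy regression dataset, and illustrative GA settings (the population size, selection, crossover, and mutation scale are assumptions, not values from the paper): the genetic algorithm first searches over the flattened weight vector, and plain gradient-descent back-propagation then fine-tunes the best individual.

```python
# Sketch: GA pre-optimization of weights/biases, then back-propagation.
# All hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (placeholder for the benchmark datasets).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

N_HIDDEN = 8
N_PARAMS = 1 * N_HIDDEN + N_HIDDEN + N_HIDDEN * 1 + 1  # W1, b1, W2, b2

def unpack(theta):
    """Split a flat parameter vector into layer weights and biases."""
    i = 0
    W1 = theta[i:i + N_HIDDEN].reshape(1, N_HIDDEN); i += N_HIDDEN
    b1 = theta[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = theta[i:i + N_HIDDEN].reshape(N_HIDDEN, 1); i += N_HIDDEN
    b2 = theta[i:i + 1]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def mse(theta):
    return float(np.mean((forward(theta, X) - y) ** 2))

# --- Stage 1: genetic algorithm over the flattened weight vector ---------
POP, GENS, MUT = 60, 200, 0.1
pop = rng.normal(scale=0.5, size=(POP, N_PARAMS))
for _ in range(GENS):
    fitness = np.array([mse(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:POP // 2]]   # truncation selection
    children = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(N_PARAMS) < 0.5           # uniform crossover
        children.append(np.where(mask, a, b)
                        + rng.normal(scale=MUT, size=N_PARAMS))  # mutation
    pop = np.vstack([parents, children])

theta = pop[np.argmin([mse(ind) for ind in pop])]   # best GA individual

# --- Stage 2: back-propagation (gradient descent) from the GA solution ---
LR, EPOCHS = 0.05, 2000
for _ in range(EPOCHS):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = 2 * (pred - y) / len(X)                   # dMSE/dpred
    gW2 = h.T @ err
    gb2 = err.sum(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)                # tanh derivative
    gW1 = X.T @ dh
    gb1 = dh.sum(axis=0)
    grad = np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])
    theta = theta - LR * grad

print("final training MSE:", mse(theta))
```

Starting back-propagation from the GA's best individual, rather than from a random initialization, is the mechanism the abstract credits for avoiding poor local minima.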
