Abstract

The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Although many modifications of BP have been proposed to speed up learning, they seldom address the local-minimum and flat-spot problems. This paper proposes a new algorithm, the Local-minimum and Flat-spot Problem Solver (LFPS), to solve these two problems. It uses a systematic approach to check whether a learning process is trapped in a local minimum or a flat-spot area, and then escapes from it. A learning process using LFPS can therefore keep finding an appropriate path toward the global minimum. The performance investigation shows that the proposed algorithm always converges in the different learning problems (applications) tested, whereas other popular fast learning algorithms sometimes exhibit very poor global convergence.
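
The abstract does not state the detection rule LFPS uses, but the flat-spot phenomenon it targets is easy to demonstrate: when a sigmoid neuron saturates, its derivative is nearly zero, so the BP weight update vanishes even while the output error remains large. The following minimal Python sketch illustrates this, along with a hypothetical detector and a simple perturbation-based escape; the thresholds (`FLAT_SPOT_DERIV_EPS`, `ERROR_THRESHOLD`) and the escape heuristic are illustrative assumptions, not the LFPS criteria from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):
    # Derivative expressed in terms of the sigmoid output y = sigmoid(x).
    return y * (1.0 - y)

rng = np.random.default_rng(0)

net_input = 8.0                   # strongly saturated neuron
target = 0.0                      # desired output
y = sigmoid(net_input)            # ~0.9997
error = target - y                # large error (~ -1.0)
grad = error * sigmoid_deriv(y)   # ~ -3e-4: almost no learning signal

print(f"output={y:.4f}, error={error:.4f}, gradient={grad:.6f}")

# Hypothetical detector (not the paper's rule): a large error combined with
# a tiny activation derivative signals a flat spot; at the network level, a
# stalled search with high remaining error may indicate a local minimum.
FLAT_SPOT_DERIV_EPS = 1e-3
ERROR_THRESHOLD = 0.1

if abs(error) > ERROR_THRESHOLD and sigmoid_deriv(y) < FLAT_SPOT_DERIV_EPS:
    # One common escape heuristic: randomly perturb the weights (here, the
    # net input) to move the neuron out of the saturated region.
    net_input += rng.normal(scale=2.0)
    print(f"flat spot detected, perturbed net input to {net_input:.4f}")
```

Running the sketch shows a near-unit error paired with a gradient on the order of 1e-4, which is why plain BP stalls in such regions and why an explicit detect-and-escape mechanism like the one LFPS proposes can restore progress.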
