Abstract

Conditions under which feedforward neural networks can be configured without local minima are analyzed for both synchronous and asynchronous learning rules. Based on this analysis, a learning algorithm is presented that integrates a synchronous-asynchronous learning rule with a dynamic configuration rule for training feedforward neural networks. Theoretical analysis and numerical simulation show that the proposed algorithm substantially reduces the likelihood of converging to local minimum solutions in supervised learning.
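
The abstract does not spell out the dynamic configuration rule itself. The following is only a minimal sketch of the general idea of dynamically configuring a feedforward network during training: a one-hidden-layer network trained by gradient descent, with a hypothetical plateau-detection heuristic (an assumption, not the paper's rule) that adds a hidden unit when the error stagnates, so that growing the architecture can help the network escape a poor stationary point.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class GrowingMLP:
    """One-hidden-layer network whose hidden layer can grow during training.

    Illustrative sketch only; not the algorithm from the paper.
    """

    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.H = sigmoid(X @ self.W1.T + self.b1)    # hidden activations
        return sigmoid(self.H @ self.W2.T + self.b2)

    def step(self, X, Y, lr=0.5):
        # One synchronous (full-batch) gradient step on mean squared error.
        P = self.forward(X)
        dP = (P - Y) * P * (1 - P)                   # output-layer delta
        dH = (dP @ self.W2) * self.H * (1 - self.H)  # hidden-layer delta
        n = len(X)
        self.W2 -= lr * dP.T @ self.H / n
        self.b2 -= lr * dP.mean(axis=0)
        self.W1 -= lr * dH.T @ X / n
        self.b1 -= lr * dH.mean(axis=0)
        return float(np.mean((P - Y) ** 2))

    def add_hidden_unit(self):
        # Dynamic configuration: append a randomly initialized hidden unit
        # with zero outgoing weights, so the current mapping is preserved.
        self.W1 = np.vstack([self.W1, rng.normal(0, 0.5, (1, self.W1.shape[1]))])
        self.b1 = np.append(self.b1, 0.0)
        self.W2 = np.hstack([self.W2, np.zeros((self.W2.shape[0], 1))])

# XOR: a classic task on which a too-small network can get stuck.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

net = GrowingMLP(n_in=2, n_hidden=1, n_out=1)
prev = np.inf
for epoch in range(20000):
    loss = net.step(X, Y)
    # Plateau heuristic (assumed for illustration): grow the network when
    # the error stops improving but is still high.
    if epoch % 1000 == 999:
        if prev - loss < 1e-4 and loss > 0.05:
            net.add_hidden_unit()
        prev = loss

print("final loss:", loss)

Initializing the new unit's outgoing weights to zero keeps the network's output unchanged at the moment of growth; subsequent gradient steps then recruit the unit only insofar as it reduces the error. This is one common way to make dynamic configuration non-disruptive; the paper's own rule may differ.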
