Abstract

A new, efficient computational technique for training multilayer feedforward neural networks is proposed. The proposed algorithm consists of two learning phases. The first phase is a local search that implements gradient descent, and the second phase is a direct search scheme that implements dynamic tunneling in weight space, escaping the local minimum trap and thereby generating the point of the next descent. Repeated alternating application of these two phases forms a new training procedure that converges to a global minimum point from any arbitrary initial choice in the weight space. Simulation results are provided for five test examples to demonstrate the efficiency of the proposed method, which overcomes the problems of initialization and local minima in multilayer perceptrons.
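
The two-phase structure described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the cube-root repeller dynamics commonly used in dynamic-tunneling methods, the step sizes, the perturbation scale, and all function names are assumptions made for illustration.

```python
# Illustrative sketch of the alternating two-phase training loop:
# Phase 1 is plain gradient descent; Phase 2 perturbs the local minimum
# and integrates repeller dynamics until a point with lower error is found.
import numpy as np

def gradient_descent(w, grad_fn, lr=0.1, steps=500):
    """Phase 1: local search by gradient descent on the error surface."""
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

def dynamic_tunneling(w_star, loss_fn, rho=1.0, dt=0.01, max_steps=2000, eps=1e-2):
    """Phase 2: from a small perturbation of the local minimum w_star,
    integrate dw/dt = rho * (w - w_star)**(1/3) (an assumed repeller form)
    until a point with error below loss_fn(w_star) is found."""
    e_star = loss_fn(w_star)
    w = w_star + eps * np.random.randn(*w_star.shape)   # small random perturbation
    for _ in range(max_steps):
        w = w + dt * rho * np.cbrt(w - w_star)           # drift away from the minimum
        if loss_fn(w) < e_star:                          # lower basin found
            return w, True
    return w_star, False

def train(w0, loss_fn, grad_fn, cycles=20):
    """Alternate the two phases until tunneling finds no lower point."""
    w = w0
    for _ in range(cycles):
        w = gradient_descent(w, grad_fn)
        w_new, found = dynamic_tunneling(w, loss_fn)
        if not found:
            break        # no lower basin found: accept w as the final minimum
        w = w_new
    return w

# Toy usage on a 1-D multimodal surface standing in for the network error.
loss_fn = lambda w: float(np.sum(w**4 - 3 * w**2 + w))
grad_fn = lambda w: 4 * w**3 - 6 * w + 1
w_final = train(np.array([2.0]), loss_fn, grad_fn)
print(w_final, loss_fn(w_final))
```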
