Abstract

This paper describes an efficient second-order algorithm, with wide and stable convergence properties, for training multilayer neural networks. First, we introduce an algorithm based on the iterative formula of the steepest descent (SD) method in which the gradient is employed implicitly. We show the equivalence between the Gauss-Newton (GN) method and the implicit steepest descent (ISD) method. This means that the ISD method meets the desired targets by combining the merits of the GN and SD techniques, thereby enhancing the favorable properties of the SD method. Next, we propose a powerful algorithm for training multilayer feedforward neural networks, called the implicit steepest descent with momentum (ISDM) method, and show its analogy to the trapezoidal formula from numerical analysis. Finally, the proposed algorithms are compared with the GN method for training multilayer neural networks through computer simulations.
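The implicit steepest-descent idea can be sketched on a simple quadratic loss. The following is an illustrative example only, not the paper's algorithm: for E(w) = ½wᵀAw − bᵀw, the explicit SD step evaluates the gradient at the current point, while the implicit step evaluates it at the new point, which turns each update into a small linear solve. For a large step size the implicit update approaches the exact Newton/GN solution A⁻¹b, loosely mirroring the GN/ISD equivalence the abstract claims. The matrix A, vector b, and step size eta below are arbitrary choices for the sketch.

```python
import numpy as np

def sd_step(w, A, b, eta):
    """Explicit SD step: gradient A w - b evaluated at the current point."""
    return w - eta * (A @ w - b)

def isd_step(w, A, b, eta):
    """Implicit SD step: w_new = w - eta * (A w_new - b), i.e. solve
    (I + eta*A) w_new = w + eta*b for the new point."""
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) + eta * A, w + eta * b)

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite (example)
b = np.array([1.0, 1.0])
w_star = np.linalg.solve(A, b)           # exact minimizer (the Newton/GN answer)

w = np.zeros(2)
for _ in range(20):
    w = isd_step(w, A, b, eta=10.0)      # remains stable for this large eta
print(np.allclose(w, w_star, atol=1e-6))
```

With eta=10 the explicit `sd_step` would diverge here (eta exceeds 2 over the largest eigenvalue of A), whereas the implicit step contracts for any positive eta, a toy illustration of the wider stability region the abstract attributes to the implicit formulation.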
