Abstract

Neural network learning algorithms based on conjugate gradient and quasi-Newton techniques, such as the Broyden, DFP, BFGS, and SSVM algorithms, require exact or inexact line searches to satisfy their convergence criteria. Line searches are costly and slow down the learning process. This paper presents new neural network learning algorithms based on Hoshino's weak line search technique and Davidon's optimally conditioned, line-search-free technique. A practical method of applying these optimization algorithms is also presented so that they largely avoid getting trapped in local minima; the global minimization problem is a serious one when quadratically convergent techniques such as quasi-Newton methods are used. To demonstrate the performance of the proposed learning algorithms, the more practical algorithm, based on Davidon's minimization technique, is applied to a cursive handwriting recognition problem. For comparison with other algorithms, a few small benchmark tests are also conducted and reported.
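To make the line-search cost concrete, the sketch below shows a standard BFGS inverse-Hessian update paired with an Armijo backtracking (inexact) line search on a small quadratic test function. This is an illustrative assumption, not the paper's Hoshino- or Davidon-based algorithm: the function names, constants, and the quadratic objective are chosen for demonstration only. The repeated function evaluations inside `armijo_line_search` at every iteration are precisely the per-step cost that the line-search-free methods discussed in the paper aim to avoid.

```python
# Illustrative sketch only (not the paper's algorithm): BFGS with an
# Armijo backtracking line search on a simple quadratic "loss".
import numpy as np

A = np.diag([1.0, 10.0])        # assumed toy Hessian of the quadratic objective

def f(w):
    # Quadratic stand-in for a network error function.
    return 0.5 * w @ A @ w

def grad_f(w):
    return A @ w

def armijo_line_search(w, d, g, c1=1e-4, shrink=0.5):
    # Inexact line search: backtrack until the sufficient-decrease condition holds.
    # Each trial requires an extra objective evaluation.
    alpha = 1.0
    while f(w + alpha * d) > f(w) + c1 * alpha * (g @ d):
        alpha *= shrink
    return alpha

def bfgs(w, tol=1e-8, max_iter=100):
    n = w.size
    H = np.eye(n)               # inverse-Hessian approximation
    g = grad_f(w)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g              # quasi-Newton search direction
        alpha = armijo_line_search(w, d, g)
        s = alpha * d           # accepted step
        w_new = w + s
        g_new = grad_f(w_new)
        y = g_new - g           # change in gradient
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        # Standard BFGS update of the inverse-Hessian approximation.
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        w, g = w_new, g_new
    return w

print(bfgs(np.array([3.0, -2.0])))   # converges to the minimizer at the origin
```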
