Abstract

Conjugate gradient methods are excellent neural network training methods, characterized by their simplicity and very low memory requirements. In this paper, we propose a new spectral conjugate gradient method that guarantees sufficient descent with any line search, thereby avoiding the usually inefficient restarts. Moreover, we establish the global convergence of the proposed method under some assumptions. Experimental results provide evidence that the proposed method is preferable to, and in general more efficient and robust than, the classical conjugate gradient methods.
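
For context, spectral conjugate gradient methods in the literature (e.g., Birgin and Martínez) generate search directions of the generic form below; the abstract does not state the authors' particular parameter choices, so the following is a standard sketch rather than the proposed method itself:

    d_k = -theta_k * g_k + beta_k * d_{k-1},

where g_k = ∇f(x_k) is the gradient at the current iterate, theta_k is a spectral (Barzilai–Borwein-type) scaling parameter, and beta_k is the conjugate gradient parameter. The sufficient descent property referred to above means that g_k^T d_k <= -c * ||g_k||^2 holds for some constant c > 0 at every iteration, independently of the line search employed.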
