Abstract

<p><span>Conjugate gradient methods are among the most important techniques for minimization and maximization problems, especially unconstrained nonlinear optimization, because of their simplicity and low memory requirements. They are applied in many areas, such as economics, engineering, neural networks, image restoration, machine learning, and deep learning. The convergence of the Fletcher-Reeves (FR) conjugate gradient method has been established under both exact and strong Wolfe line searches; however, its practical performance is poor. In this paper, a small modification of the FR method is proposed to obtain good numerical performance. The global convergence of the modified version is established for general nonlinear functions. Preliminary numerical results show that the modified method is very efficient in terms of the number of iterations and CPU time.</span></p>
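To make the method under discussion concrete, the following is a minimal sketch of the classical (unmodified) Fletcher-Reeves nonlinear conjugate gradient method with a simple backtracking (Armijo) line search. The abstract does not specify the paper's modification, so none is shown here; the line-search parameters and the descent-direction restart safeguard are illustrative choices, not the paper's.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Classical Fletcher-Reeves nonlinear CG with an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: restart with steepest descent if d is not a descent direction
        if g.dot(d) >= 0:
            d = -g
        # Backtracking line search satisfying the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-16:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient: beta_k = ||g_{k+1}||^2 / ||g_k||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = x1^2 + 10*x2^2
f = lambda x: x[0]**2 + 10.0 * x[1]**2
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x_star = fletcher_reeves_cg(f, grad, [1.0, 1.0])
```

For strong Wolfe line searches, as used in the convergence analysis cited above, the Armijo test would be supplemented with a curvature condition on the new gradient.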
