Abstract

Nonlinear conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems. Their broad applicability across many fields is due to their low memory requirements. Numerous studies have recently sought to improve these methods. In this paper, a new class of conjugate gradient coefficients that possess global convergence properties is proposed. The global convergence result under exact line searches is discussed. Numerical results show that the proposed method is more efficient than the classical conjugate gradient methods.
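The abstract does not state the proposed coefficient formula, but the general nonlinear conjugate gradient framework it builds on can be sketched as follows. This is a minimal illustration using the classical Fletcher-Reeves coefficient and a backtracking (Armijo) line search standing in for an exact line search; it is not the paper's proposed method.

```python
import numpy as np

def nonlinear_cg_fr(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear CG sketch with the classical Fletcher-Reeves coefficient
    beta_k = ||g_{k+1}||^2 / ||g_k||^2 (illustrative, not the paper's method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search as a practical stand-in
        # for the exact line search assumed in the convergence analysis.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # update the search direction
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = nonlinear_cg_fr(f, grad, np.zeros(2))
```

Different choices of the coefficient beta (Fletcher-Reeves, Polak-Ribiere, Hestenes-Stiefel, and newer proposals such as the class introduced in this paper) yield different members of the conjugate gradient family, all sharing the same low-memory iteration structure above.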
