Abstract

Following recent attempts to find appropriate choices for the parameter of the nonlinear conjugate gradient method proposed by Dai and Liao, two adaptive versions of the method are proposed based on a matrix analysis and the memoryless BFGS updating formula. Under proper conditions, the methods are shown to be globally convergent. Numerical experiments carried out on a set of unconstrained optimization test problems from the CUTEr collection demonstrate the efficiency of the proposed methods in the sense of the Dolan–Moré performance profile.
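
For reference, the Dai–Liao method referred to above generates search directions of the following standard form; the notation below (g_k, s_k, y_k and the parameter t) is the usual convention and is not fixed by the abstract itself:

\[
d_{k+1} = -g_{k+1} + \beta_k^{DL}\, d_k,
\qquad
\beta_k^{DL} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k} - t\,\frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k},
\qquad t \ge 0,
\]

where g_k = \nabla f(x_k), s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The two adaptive versions mentioned in the abstract correspond to particular choices of the parameter t, obtained from a matrix analysis involving the memoryless BFGS updating formula.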
