Abstract

This article presents a new conjugate gradient (CG) method that requires only first-order derivatives: it overcomes the slow convergence associated with the steepest descent method while avoiding the computation of second-order derivatives required by Newton's method. The CG update parameter is derived from the extended conjugacy condition as a convex combination of the Polak-Ribière-Polyak (PRP) and Dai-Yuan (DY) parameters, employing an optimal choice of the modulating parameter t. The scheme adopts an inexact line search to obtain a step size that preserves the descent property, without requiring exact computation of the step size, and it converges globally under the Wolfe line search conditions. Numerical computations show that the algorithm is robust and efficient in terms of the number of iterations and CPU time.

Keywords: Conjugate gradient method, Descent property, Dai-Liao conjugacy condition, Global convergence, Numerical methods
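
For orientation, the two parameters being combined have well-known standard forms, and the hybrid parameter below is the generic convex-combination form the abstract describes; the abstract does not give the paper's exact formula for the weight θ_k, only that it is tied to the extended (Dai-Liao) conjugacy condition through the modulating parameter t:

```latex
\[
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top} y_k}{\lVert g_k \rVert^{2}},
\qquad
\beta_k^{\mathrm{DY}} = \frac{\lVert g_{k+1} \rVert^{2}}{d_k^{\top} y_k},
\qquad
y_k = g_{k+1} - g_k,
\]
\[
\beta_k = (1 - \theta_k)\,\beta_k^{\mathrm{PRP}} + \theta_k\,\beta_k^{\mathrm{DY}},
\qquad \theta_k \in [0, 1],
\]
\[
d_{k+1} = -g_{k+1} + \beta_k d_k,
\qquad
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k,
\qquad
s_k = x_{k+1} - x_k,
\]
```

where g_k denotes the gradient at the iterate x_k and d_k the search direction; the last relation is the Dai-Liao conjugacy condition listed in the keywords.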
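A minimal Python sketch of this iteration follows. It is illustrative only, not the paper's implementation: the function name hybrid_prp_dy_cg, the fallback step size, the default value of t, and the recovery of θ_k by matching the convex combination to the Dai-Liao parameter are all assumptions; the inexact Wolfe step is supplied by scipy.optimize.line_search.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_prp_dy_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Sketch of a hybrid PRP/DY conjugate gradient method (assumed form).

    theta_k is recovered by matching the convex combination to the
    Dai-Liao parameter; the paper's optimal theta_k and t may differ.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Inexact line search satisfying the Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:      # fall back to a small step if the search fails
            alpha = 1e-4
        s = alpha * d          # s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g          # y_k = g_{k+1} - g_k
        beta_prp = (g_new @ y) / (g @ g)
        beta_dy = (g_new @ g_new) / (d @ y)
        # Dai-Liao parameter from the extended conjugacy condition
        # d_{k+1}^T y_k = -t * g_{k+1}^T s_k.
        beta_dl = (g_new @ y - t * (g_new @ s)) / (d @ y)
        # Solve beta_dl = (1 - theta) * beta_prp + theta * beta_dy for theta,
        # clipping into [0, 1] to keep the combination convex.
        denom = beta_dy - beta_prp
        theta = np.clip((beta_dl - beta_prp) / denom, 0.0, 1.0) if denom != 0 else 0.0
        beta = (1.0 - theta) * beta_prp + theta * beta_dy
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

As a quick check, hybrid_prp_dy_cg(rosen, rosen_der, np.array([-1.2, 1.0])), with rosen and rosen_der imported from scipy.optimize, should return a point close to the Rosenbrock minimizer (1, 1).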
