Abstract

Three-term conjugate gradient (CG) algorithms are among the most efficient variants of the CG method for convex and nonconvex functions, because most three-term algorithms are built on classical CG methods whose numerical performance has been tested and whose convergence has been proved. In this paper, we present a modification of the RMIL$+$ CG method proposed by Dai [Z. Dai, Appl. Math. Comput., \(\bf 267\) (2016), 297--300], based on the convergence analysis of the RMIL (2012) CG method. Interestingly, the modified method possesses the sufficient descent condition, and its global convergence is established under the exact minimization condition. We further extend the modified RMIL$+$ method to construct a three-term CG algorithm and show that it also satisfies the sufficient descent condition under the strong Wolfe line search. Preliminary numerical results on known benchmark problems show that the proposed methods are efficient and promising compared with other CG methods.
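To illustrate the kind of iteration the abstract refers to, the following is a minimal sketch of a CG method using the RMIL conjugacy parameter of Rivaie et al. (2012), \(\beta_k = g_{k+1}^T(g_{k+1}-g_k)/\|d_k\|^2\), with an RMIL$+$-style nonnegativity safeguard, applied to a convex quadratic where the exact minimization step length is available in closed form. This is an assumed, simplified setting for illustration only, not the modified method proposed in the paper.

```python
import numpy as np

def rmil_cg(A, b, x0, tol=1e-8, max_iter=200):
    """Sketch of a CG iteration with the RMIL beta on the convex
    quadratic f(x) = 0.5 x'Ax - b'x (A symmetric positive definite)."""
    x = x0.astype(float)
    g = A @ x - b                        # gradient of f at x
    d = -g                               # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Exact minimization along d is closed-form for a quadratic.
        alpha = -(g @ d) / (d @ (A @ d))
        x = x + alpha * d
        g_new = A @ x - b
        # RMIL parameter: beta = g_{k+1}'(g_{k+1} - g_k) / ||d_k||^2,
        # truncated at zero in the spirit of RMIL+ (Dai, 2016).
        beta = max(g_new @ (g_new - g) / (d @ d), 0.0)
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = rmil_cg(A, b, np.zeros(2))           # converges to the solution of Ax = b
```

With exact line search on a quadratic, successive gradients are orthogonal, so the RMIL update behaves much like the classical Fletcher–Reeves formula on this toy problem; the distinctions the paper analyzes matter for general nonlinear objectives under inexact (e.g., strong Wolfe) line searches.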
