Abstract

Two new classes of nonlinear conjugate gradient methods are proposed to avoid the drawbacks of the Fletcher–Reeves (FR) and Conjugate Descent (CD) methods. Using induction and proof by contradiction, we show that the proposed methods satisfy the sufficient descent property independently of any line search, and that they converge globally under the Wolfe line search. Numerical results on 10 classical unconstrained optimization problems indicate that the proposed methods outperform comparable methods in the number of iterations and of function and gradient evaluations, confirming their effectiveness.
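The abstract does not state the new formulas for the conjugacy parameter, so no sketch of the proposed methods themselves is possible here. For context, the standard nonlinear conjugate gradient iteration, the classical FR and CD parameters being improved upon, and the Wolfe line search conditions under which convergence is proved can be recalled as follows (a reference summary of the established framework, not the authors' new methods):

```latex
% Generic nonlinear conjugate gradient iteration for minimizing f,
% with g_k = \nabla f(x_k) and step size \alpha_k from a line search:
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0.
\]
% Classical choices of the conjugacy parameter \beta_k whose
% drawbacks motivate the proposed methods:
\[
\beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{\mathrm{CD}} = -\frac{\|g_{k+1}\|^2}{d_k^{\top} g_k}.
\]
% Wolfe line search conditions on \alpha_k, with 0 < \delta < \sigma < 1:
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k.
\]
```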
