Abstract

Global convergence results are derived for well-known conjugate gradient methods in which the line search step is replaced by a step whose length is determined by a formula. The results include the following cases: (1) the Fletcher–Reeves method, the Hestenes–Stiefel method, and the Dai–Yuan method applied to a strongly convex LC¹ objective function; (2) the Polak–Ribière method and the Conjugate Descent method applied to a general, not necessarily convex, LC¹ objective function.
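For reference, the methods named above differ only in the choice of the scalar β_k in the standard nonlinear conjugate gradient iteration. The following is a sketch of the usual update and β formulas (with g_k = ∇f(x_k) and y_{k-1} = g_k − g_{k-1}); the specific step-length formula that replaces the line search is given in the paper itself and is not stated in this abstract:

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_k = -g_k + \beta_k d_{k-1}, \qquad d_0 = -g_0,
\]
\[
\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \quad
\beta_k^{HS} = \frac{g_k^{T} y_{k-1}}{d_{k-1}^{T} y_{k-1}}, \quad
\beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^{T} y_{k-1}}, \quad
\beta_k^{PR} = \frac{g_k^{T} y_{k-1}}{\|g_{k-1}\|^2}, \quad
\beta_k^{CD} = -\frac{\|g_k\|^2}{g_{k-1}^{T} d_{k-1}}.
\]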
