Abstract

It is generally acknowledged that the conjugate gradient (CG) method achieves global convergence, with at most a linear convergence rate, because CG formulas are generated by linear approximations of the objective function; quadratically convergent results are very limited. We introduce a new PRP method that employs a restart strategy and exploits both function value and gradient information. In this paper, we show that the new PRP method, with either the Armijo line search or the Wolfe line search, is globally convergent and n-step quadratically convergent. Numerical experiments demonstrate that the new PRP algorithm is competitive with the standard CG method.
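
To make the ingredients concrete, below is a minimal sketch of a PRP conjugate gradient iteration combining a backtracking Armijo line search with a Powell-style restart. This is an illustration only, not the authors' algorithm: the specific restart tests, the parameter values (sigma, beta, c), and the names armijo and prp_restart are assumptions made for the sketch.

    import numpy as np

    def armijo(f, x, d, g, beta=0.5, sigma=1e-4, max_backtracks=50):
        # Backtracking Armijo search: accept the first step t with
        # f(x + t*d) <= f(x) + sigma * t * g'd.
        t, fx, gd = 1.0, f(x), g @ d
        for _ in range(max_backtracks):
            if f(x + t * d) <= fx + sigma * t * gd:
                break
            t *= beta
        return t

    def prp_restart(f, grad, x0, tol=1e-6, max_iter=1000, c=0.2):
        # PRP conjugate gradient with Powell-style restarts (assumed here):
        # revert to steepest descent every n steps, when consecutive
        # gradients lose near-orthogonality, or when the new direction
        # fails to be a descent direction.
        x, n = x0.astype(float), x0.size
        g = grad(x)
        d = -g
        for k in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            t = armijo(f, x, d, g)
            x_new = x + t * d
            g_new = grad(x_new)
            beta_prp = g_new @ (g_new - g) / (g @ g)   # classical PRP coefficient
            d_new = -g_new + beta_prp * d
            if ((k + 1) % n == 0
                    or abs(g_new @ g) >= c * (g_new @ g_new)
                    or g_new @ d_new >= 0):
                d_new = -g_new                          # restart
            x, g, d = x_new, g_new, d_new
        return x

    # Usage on a small convex quadratic f(x) = 0.5*x'Ax - b'x:
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    x_star = prp_restart(lambda x: 0.5 * x @ A @ x - b @ x,
                         lambda x: A @ x - b,
                         np.zeros(2))
    print(x_star)   # approximately the solution of A x = b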

Highlights

  • Consider min f(x), x ∈ ℝⁿ

  • Li and Tian [34] proved that the modified PRP (MPRP) method with a restart strategy exhibits quadratic convergence under certain inexact line searches and suitable assumptions

  • By applying the conclusion of [34], we prove that the new conjugate gradient (CG) method achieves global convergence for general functions and n-step quadratic convergence for uniformly convex functions (the classical PRP update these methods build on is recalled below)
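
For reference, the classical PRP scheme that the MPRP method of [34] and the new method modify has the following textbook form (this is the standard update, not the paper's modified variant):

    \[
      d_0 = -g_0, \qquad
      d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{PRP}}\, d_k, \qquad
      \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{\|g_k\|^{2}},
    \]

where g_k = ∇f(x_k); a restart simply replaces d_{k+1} by the steepest-descent direction -g_{k+1} whenever the restart criterion fires.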

Summary

Introduction

Consider min f(x), x ∈ ℝⁿ

Motivation and Algorithm
Numerical Results
Conclusion

