Abstract
In this article, we consider the global convergence of the Polak–Ribière–Polyak conjugate gradient method (abbreviated PRP method) for minimizing functions that have Lipschitz continuous partial derivatives. A novel form of non-monotone line search is proposed to guarantee the global convergence of the PRP method. It is also shown that the PRP method has a linear convergence rate under some mild conditions when the non-monotone line search reduces to a related monotone line search. The new non-monotone line search needs to estimate the Lipschitz constant of the gradients of the objective function, for which two practical estimates are proposed to help find a suitable initial step size for the PRP method. Numerical results show that the new line search approach is efficient in practical computation.
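For readers unfamiliar with the method, the following is a minimal sketch of the standard PRP conjugate gradient iteration. The Armijo backtracking rule and the rough Lipschitz-based initial step shown here are illustrative stand-ins only; they are not the non-monotone line search or the Lipschitz estimations proposed in the article.

```python
import numpy as np

def prp_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Sketch of the Polak-Ribiere-Polyak conjugate gradient method.

    The monotone Armijo backtracking below is a placeholder for the
    paper's non-monotone line search; the Lipschitz-based initial trial
    step is an assumed heuristic, not the paper's estimation scheme.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    x_prev, g_prev = None, None
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Rough local Lipschitz estimate from the last two gradients,
        # used only to choose the initial trial step size (assumption).
        if g_prev is not None and np.linalg.norm(x - x_prev) > 0:
            L_est = np.linalg.norm(g - g_prev) / np.linalg.norm(x - x_prev)
            alpha = min(1.0, 1.0 / max(L_est, 1e-12))
        else:
            alpha = 1.0
        # Restart with steepest descent if d is not a descent direction.
        if g.dot(d) >= 0:
            d = -g
        # Armijo backtracking (monotone stand-in for the non-monotone rule).
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_prev, g_prev = x, g
        x = x + alpha * d
        g = grad(x)
        # PRP parameter: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g.dot(g - g_prev) / max(g_prev.dot(g_prev), 1e-12)
        d = -g + beta * d
    return x
```

A typical call would pass a smooth objective and its gradient, e.g. `prp_cg(lambda x: x.dot(x), lambda x: 2 * x, np.ones(5))`, which converges to the origin.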