Abstract
The nonlinear conjugate gradient (CG) algorithm is a very effective method for optimization, especially for large-scale problems, because of its low memory requirement and simplicity. Zhang et al. (IMA J. Numer. Anal. 26:629-649, 2006) first proposed a three-term CG algorithm for unconstrained optimization based on the well-known Polak-Ribière-Polyak (PRP) formula, whose search direction satisfies the sufficient descent property without any line search technique. They proved global convergence under the Armijo line search, but the proof does not extend to the Wolfe line search technique. Inspired by their method, we make a further study and give a modified three-term PRP CG algorithm. The presented method possesses the following features: (1) the sufficient descent property holds without any line search technique; (2) the trust region property of the search direction is automatically satisfied; (3) the steplength is bounded from below; (4) global convergence is established under the Wolfe line search. Numerical results show that the new algorithm is more effective than the standard method.
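For context, the Wolfe line search referred to in feature (4) is usually stated as follows; the parameter names δ and σ are the conventional ones from the CG literature and may differ from the paper's own notation:

```latex
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma g_k^{\top} d_k,
\qquad 0 < \delta < \sigma < 1,
```

where $g_k = \nabla f(x_k)$, $d_k$ is the search direction, and $\alpha_k$ is the steplength.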
Highlights
We consider the optimization model min_{x ∈ ℝⁿ} f(x), where the function f : ℝⁿ → ℝ is continuously differentiable
The PRP method is very efficient in numerical performance, but its global convergence for general functions under the Wolfe line search technique fails; this is still an open problem that many scholars want to solve
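For reference, the classical PRP update discussed here is standard in the CG literature and can be written as:

```latex
\beta_k^{PRP} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{\|g_k\|^2},
\qquad
d_{k+1} = -g_{k+1} + \beta_k^{PRP} d_k,
\qquad d_0 = -g_0,
```

where $g_k = \nabla f(x_k)$. The convergence difficulty arises because $d_{k+1}$ defined this way need not be a descent direction under the Wolfe line search.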
It is worth noting that a recent work of Yuan et al. [ ] proved global convergence of the PRP method for general functions under a modified Wolfe line search technique
Summary
We consider min_{x ∈ ℝⁿ} f(x), where the function f : ℝⁿ → ℝ is continuously differentiable. Many similar problems across professional fields of science reduce to the above optimization model (see, e.g., [ – ]). It has been proved that, even for the function f(x) = λ‖x‖² (λ > 0 a constant) under the strong Wolfe conditions, the PRP conjugate gradient method may not yield a descent direction for an unsuitable choice of parameters (see [ ] for details). In order to overcome this drawback, we propose a modified three-term PRP formula that has the sufficient descent property and the trust region feature, and we establish the sufficient descent property, the trust region feature, and global convergence. An interesting feature of the new three-term CG method is that the given search direction is sufficiently descent.
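To make the three-term construction concrete, the following is a minimal sketch of the original Zhang et al. three-term PRP iteration (not the paper's modified formula, which is not reproduced on this page), paired with a backtracking Armijo line search. The function names and the toy quadratic problem are illustrative assumptions:

```python
import numpy as np

def three_term_prp(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of the Zhang et al. three-term PRP CG iteration:
        d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k,
    with y_k = g_{k+1} - g_k,
         beta_k  = g_{k+1}^T y_k / ||g_k||^2,
         theta_k = g_{k+1}^T d_k / ||g_k||^2.
    By construction d_k^T g_k = -||g_k||^2, so the sufficient
    descent property holds independently of the line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (descent is guaranteed,
        # so the loop terminates).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gk2 = g.dot(g)
        beta = g_new.dot(y) / gk2
        theta = g_new.dot(d) / gk2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Toy usage (illustrative): minimize the convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_prp(f, grad, np.zeros(2))
```

Note how subtracting the extra term `theta * y` is exactly what forces `d.dot(g) == -g.dot(g)` at every iterate, which is the sufficient descent identity the abstract refers to.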