Abstract
The spectral conjugate gradient algorithm, a variant of the conjugate gradient method, is an effective approach to unconstrained optimization problems. In this paper, two new spectral conjugate gradient algorithms based on the Hestenes–Stiefel method, Descend Hestenes–Stiefel (DHS) and Wang–Hestenes–Stiefel (WHS), are proposed. Under the Wolfe line search and mild assumptions on the objective function, both algorithms possess the sufficient descent property without any additional conditions and are globally convergent. Numerical results show that the new algorithms outperform the Hestenes–Stiefel conjugate gradient method.
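To make the setting concrete, the sketch below shows a generic spectral conjugate gradient iteration with the Hestenes–Stiefel conjugacy parameter and a Wolfe line search. The abstract does not give the DHS/WHS update formulas, so the spectral parameter here is a common Barzilai–Borwein-style choice, used purely for illustration; the function name `spectral_hs_cg` and the fallback step size are likewise assumptions, not the paper's method.

```python
# Illustrative sketch of a spectral Hestenes-Stiefel (HS) conjugate gradient
# iteration, NOT the paper's DHS/WHS algorithms (their formulas are not given
# in the abstract). The spectral parameter theta is a Barzilai-Borwein-style
# choice assumed here for demonstration.
import numpy as np
from scipy.optimize import line_search

def spectral_hs_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search along d (SciPy's implementation)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                   # line search failed; small fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x                       # step
        y = g_new - g                       # gradient change
        # Hestenes-Stiefel conjugacy parameter: beta = g_{k+1}^T y_k / (d_k^T y_k)
        beta = (g_new @ y) / (d @ y + 1e-16)
        # Spectral parameter (Barzilai-Borwein-like; assumed for illustration)
        theta = (s @ s) / (s @ y + 1e-16)
        # Spectral CG direction: d_{k+1} = -theta * g_{k+1} + beta * d_k
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
print(spectral_hs_cg(f, grad, np.array([3.0, -2.0])))
```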