Abstract

The spectral conjugate gradient algorithm, a variant of the conjugate gradient method, is one of the effective methods for solving unconstrained optimization problems. In this paper, based on the Hestenes–Stiefel method, two new spectral conjugate gradient algorithms (Descent Hestenes–Stiefel (DHS) and Wang–Hestenes–Stiefel (WHS)) are proposed. Under the Wolfe line search and mild assumptions on the objective function, the two algorithms possess the sufficient descent property without any other conditions and are globally convergent. Numerical results show that the new algorithms outperform the Hestenes–Stiefel conjugate gradient method.
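Since the abstract describes the general family of spectral Hestenes–Stiefel methods rather than the specific DHS/WHS update formulas, the following is only a minimal sketch of a generic spectral HS iteration. The spectral parameter `theta` is a placeholder assumption (the paper's DHS/WHS choices are not given here); the HS parameter beta_k = g_{k+1}^T y_k / (d_k^T y_k) and the Wolfe line search follow standard definitions, and the function name `spectral_hs_cg` is hypothetical.

```python
import numpy as np
from scipy.optimize import line_search

def spectral_hs_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic spectral Hestenes-Stiefel CG sketch (not the paper's DHS/WHS)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the strong Wolfe conditions
        alpha, *_ = line_search(f, grad, x, d)
        if alpha is None:           # line search failed; conservative fallback
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g               # gradient difference y_k = g_{k+1} - g_k
        # Hestenes-Stiefel parameter: beta_k = g_{k+1}^T y_k / (d_k^T y_k)
        beta = g_new.dot(y) / d.dot(y)
        # Spectral direction: d_{k+1} = -theta_k * g_{k+1} + beta_k * d_k.
        # theta = 1 recovers plain HS; DHS/WHS choose theta_k so that
        # d_{k+1}^T g_{k+1} <= -c ||g_{k+1}||^2 (sufficient descent) holds.
        theta = 1.0                 # placeholder spectral parameter
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage on the Rosenbrock test function:
# from scipy.optimize import rosen, rosen_der
# x_star = spectral_hs_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
```

The sketch differs from plain HS only in the `theta` factor multiplying the gradient; the paper's contribution lies in choosing that factor so sufficient descent holds under the Wolfe line search alone.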
