Abstract

Hybrid conjugate gradient (CG) parameters are among the efficient variants of CG methods for solving large-scale unconstrained optimization problems, owing to their good convergence properties and low memory requirements. In this paper, we present a new hybrid CG method, built from well-known CG algorithms, for large-scale unconstrained optimization. The proposed hybrid CG method generates a descent search direction at each iteration provided the strong Wolfe line search is employed. Numerical results are presented which show that the proposed method is efficient and promising.
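The abstract does not state the paper's specific hybrid parameter, so the following is only a minimal sketch of the generic hybrid CG framework it refers to: a convex "clipping" hybrid of the Polak-Ribiere-Polyak and Fletcher-Reeves parameters (the Hu-Storey choice, beta_k = max(0, min(beta_PRP, beta_FR))) combined with a strong Wolfe line search via scipy.optimize.line_search. The function names and tolerances are illustrative assumptions, not part of the paper.

```python
# Illustrative sketch only: a generic hybrid CG iteration with a well-known
# hybrid parameter, beta_k = max(0, min(beta_PRP, beta_FR)), and a strong
# Wolfe line search from SciPy. This is NOT the paper's proposed parameter.
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step length satisfying the strong Wolfe conditions
        # (SciPy defaults: c1 = 1e-4, c2 = 0.9).
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                  # line search failed: restart
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Hybrid CG parameter: clip Polak-Ribiere-Polyak by Fletcher-Reeves.
        beta_fr = (g_new @ g_new) / (g @ g)
        beta_prp = (g_new @ (g_new - g)) / (g @ g)
        beta = max(0.0, min(beta_prp, beta_fr))
        d = -g_new + beta * d              # new search direction
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Example run on the Rosenbrock test function.
    x_star = hybrid_cg(rosen, rosen_der, [-1.2, 1.0])
    print("approximate minimizer:", x_star)
```

With this kind of clipped hybrid parameter and a strong Wolfe step, each direction d_k is a descent direction, which is the structural property the abstract claims for the proposed method.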
