Abstract
The conjugate gradient method is effective for solving large-scale unconstrained optimization problems. This paper proposes a new conjugate gradient algorithm based on the self-scaling memoryless BFGS update, combined with the Wolfe line search. The descent property and global convergence of the method are established under mild conditions. Numerical experiments show that the new conjugate gradient method is effective.
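To illustrate the class of methods the abstract refers to, the following is a minimal sketch of a conjugate-gradient-type iteration whose search direction is derived from a self-scaling memoryless BFGS matrix, using SciPy's Wolfe line search. This is a generic sketch of the technique, not the paper's specific algorithm: the Oren–Luenberger scaling `tau = s'y / y'y` and all parameter choices here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import line_search

def ssml_bfgs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """CG-type method: d = -H g, where H is the tau-scaled
    memoryless BFGS inverse-Hessian approximation (a sketch,
    not the paper's exact update)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        # Wolfe line search (SciPy's strong-Wolfe implementation)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:           # line search failed; tiny fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:              # curvature condition holds
            rho = 1.0 / sy
            yy = y @ y
            tau = sy / yy           # Oren-Luenberger scaling (one common choice)
            gy, gs = g_new @ y, g_new @ s
            # d = -H g_new, with
            # H = tau*I - tau*rho*(s y' + y s') + rho*(1 + tau*rho*y'y) s s'
            d = (-tau * g_new
                 + (tau * rho * gy - rho * (1.0 + tau * rho * yy) * gs) * s
                 + tau * rho * gs * y)
        else:
            d = -g_new              # restart with steepest descent
        x, g = x_new, g_new
    return x
```

Since `sy > 0` under the Wolfe conditions, the scaled memoryless BFGS matrix is positive definite, so each direction is a descent direction, which is the property the abstract's convergence analysis relies on.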