Abstract

The spectral conjugate gradient method is an efficient approach for solving large-scale unconstrained optimization problems. In this paper, we propose a new spectral conjugate gradient method and analyze its performance numerically. We establish the descent condition and the global convergence property under some assumptions and the strong Wolfe line search. Numerical experiments evaluating the method's efficiency are conducted on 98 problems with various dimensions and initial points. The numerical results, based on the number of iterations and central processing unit time, show that the new method achieves high computational performance.
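
Below is a minimal sketch of a generic spectral conjugate gradient loop under a strong Wolfe line search, not the specific method proposed in the paper: the spectral parameter θ_k and the conjugate parameter β_k used here are illustrative textbook choices (a Barzilai–Borwein-type spectral parameter and a Polak–Ribière-plus β), and the helper name `spectral_cg` is hypothetical. The standard `scipy.optimize.line_search` routine is used to enforce the Wolfe conditions.

```python
# Sketch of a generic spectral conjugate gradient iteration
#   x_{k+1} = x_k + alpha_k * d_k,   d_k = -theta_k * g_k + beta_k * d_{k-1}
# The theta_k and beta_k formulas below are illustrative assumptions,
# NOT the formulas proposed in the paper.
import numpy as np
from scipy.optimize import line_search  # line search satisfying the Wolfe conditions


def spectral_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search along d (c1, c2 as in Nocedal & Wright).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:  # line search failed; fall back to a small step
            alpha = 1e-4
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Illustrative parameters (assumptions, not the paper's choices):
        # Barzilai-Borwein-type spectral parameter and Polak-Ribiere-plus beta.
        theta = float(s @ s) / max(float(s @ y), 1e-12)
        beta = max(float(g_new @ y) / max(float(g @ g), 1e-12), 0.0)
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x


# Example: minimize the Rosenbrock function from a standard starting point.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(spectral_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```

Replacing θ_k and β_k with the parameters derived in the paper would recover the proposed method; the surrounding loop structure and the strong Wolfe line search are common to this class of algorithms.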
