Abstract

Recently, several three-term conjugate gradient (CG) methods for solving unconstrained optimization problems have been proposed. Most of these methods focus on improving the convergence of the classical PRP CG method while retaining its excellent numerical performance. In general, the PRP method is not globally convergent because it fails to satisfy the sufficient descent property, in particular under the modified Armijo or Wolfe line searches. In this paper, we propose an efficient three-term conjugate gradient method based on a modified PRP formula that satisfies both the sufficient descent property and global convergence under the Wolfe line search. In particular, a new conjugate parameter is constructed: it retains the numerator of the PRP method and introduces an acceleration term in the denominator. The new denominator is designed to reduce the number of iterations and CPU time as well as to improve convergence. Numerical results on a collection of standard unconstrained optimization test problems show that the proposed method is promising and performs better than several well-known CG methods.
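For reference, the classical PRP parameter and a generic three-term search direction take the form

$$\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}}, \qquad d_k = -g_k + \beta_k\, d_{k-1} + \theta_k\, y_{k-1}, \qquad y_{k-1} = g_k - g_{k-1},$$

where $g_k$ is the gradient at the current iterate. This is only a standard sketch of the framework: the proposed method keeps the PRP numerator but replaces the denominator with an acceleration term whose exact form is given in the paper, and the parameter $\theta_k$ above is a generic placeholder rather than the paper's specific choice.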
