Abstract

When applied to general functions, the conjugate gradient and quasi-Newton methods each have particular advantages and disadvantages. Conjugate gradient (CG) techniques are a class of unconstrained optimization algorithms with strong local and global convergence properties and modest memory requirements. Quasi-Newton methods are reliable and efficient on a wide range of problems; they converge faster than the conjugate gradient method and require fewer function evaluations, but they demand substantially more storage and, on ill-conditioned problems, may require many iterations. A class of methods that combines these two approaches has been developed, termed preconditioned conjugate gradient (PCG) methods. In this work, two new preconditioned conjugate gradient algorithms, New PCG1 and New PCG2, are proposed for solving nonlinear unconstrained optimization problems. New PCG1 combines the Hestenes-Stiefel (HS) conjugate gradient method with a new self-scaling symmetric rank-one (SR1) update, and New PCG2 combines the Hestenes-Stiefel (HS) conjugate gradient method with a new self-scaling Davidon-Fletcher-Powell (DFP) update. Both algorithms use the strong Wolfe line search conditions. Numerical comparisons with standard preconditioned conjugate gradient algorithms show that the new computational schemes outperform the standard preconditioned conjugate gradient method.
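The abstract names only the ingredients of the proposed methods: an HS conjugate gradient direction, a quasi-Newton preconditioner, and a strong Wolfe line search. The sketch below illustrates that general PCG framework and is not the authors' algorithm; the SR1-based preconditioner, the HS beta applied to the preconditioned gradient, and the use of scipy.optimize.line_search as a strong Wolfe search are assumptions made here for illustration, and the paper's self-scaling SR1 and DFP formulas are not reproduced.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def pcg_hs_sr1(f, grad, x0, tol=1e-6, max_iter=500):
    """Illustrative preconditioned CG: HS beta with an SR1-updated
    inverse-Hessian approximation H used as the preconditioner.
    (Sketch only; the paper's self-scaling variants are not shown.)"""
    x = np.asarray(x0, dtype=float)
    n = x.size
    g = grad(x)
    H = np.eye(n)                  # initial preconditioner (identity)
    d = -H @ g                     # first search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Wolfe line search (SciPy's routine enforces the Wolfe conditions)
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:
            alpha = 1e-4           # fallback step if the search fails
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # SR1 update of the inverse-Hessian preconditioner, with the
        # usual safeguard against a near-zero denominator
        v = s - H @ y
        denom = v @ y
        if abs(denom) > 1e-8 * np.linalg.norm(v) * np.linalg.norm(y):
            H = H + np.outer(v, v) / denom
        # Hestenes-Stiefel beta computed with the preconditioned gradient
        dy = d @ y
        beta = (g_new @ (H @ y)) / dy if abs(dy) > 1e-12 else 0.0
        d = -H @ g_new + beta * d
        x, g = x_new, g_new
    return x

# Example use on the Rosenbrock function
x_min = pcg_hs_sr1(rosen, rosen_der, np.array([-1.2, 1.0]))
```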
