Abstract

Conjugate gradient (CG) methods are a class of unconstrained optimization algorithms with strong local and global convergence properties and minimal memory requirements. Quasi-Newton methods are reliable and efficient on a wide range of problems; they converge faster than conjugate gradient methods and require fewer function evaluations, but they demand substantially more storage and, when the problem is ill-conditioned, may need many iterations. A third class, the preconditioned conjugate gradient methods, combines the conjugate gradient and quasi-Newton approaches. In this work, we propose two new limited-memory preconditioned conjugate gradient methods (New1 and New2) for solving nonlinear unconstrained minimization problems, based on a new modified symmetric rank-one update (NMSR1) and a new modified Davidon-Fletcher-Powell update (NMDFP), together with projected vectors. We prove that these modifications satisfy certain conditions, and we also prove the descent condition of the new technique. Numerical results on standard nonlinear unconstrained test problems demonstrate the efficiency of the proposed algorithms.
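
To illustrate the general idea of a quasi-Newton-preconditioned conjugate gradient iteration (not the authors' New1/New2 methods, whose NMSR1/NMDFP updates and projected vectors are not reproduced here), the following minimal sketch uses the classical DFP inverse-Hessian update as a variable preconditioner inside a nonlinear CG loop. The function names, the preconditioned PR+ beta formula, and the backtracking Armijo line search are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dfp_update(H, s, y):
    # Classical DFP update of the inverse-Hessian approximation H
    # (illustrative stand-in; the paper uses modified NMDFP/NMSR1 forms).
    Hy = H @ y
    return H - np.outer(Hy, Hy) / (y @ Hy) + np.outer(s, s) / (y @ s)

def preconditioned_cg(f, grad, x0, tol=1e-6, max_iter=500):
    # Preconditioned nonlinear CG: the quasi-Newton matrix H acts as a
    # preconditioner, and beta is a preconditioned PR+ formula (an
    # illustrative choice, not the paper's update).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    H = np.eye(x.size)            # initial preconditioner: identity
    d = -H @ g                    # preconditioned steepest-descent start
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:            # safeguard: restart on non-descent direction
            d = -H @ g
        # Backtracking Armijo line search
        alpha, f0, slope = 1.0, f(x), g @ d
        for _ in range(50):
            if f(x + alpha * d) <= f0 + 1e-4 * alpha * slope:
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if y @ s > 1e-12:         # curvature condition keeps H positive definite
            H = dfp_update(H, s, y)
        beta = max(0.0, (g_new @ (H @ y)) / (g @ (H @ g)))  # preconditioned PR+
        d = -H @ g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the 2-D Rosenbrock function from a standard start point.
f = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(preconditioned_cg(f, grad, [-1.2, 1.0]))  # approaches [1, 1]
```

The restart safeguard mirrors the role of a descent condition: whenever the combined direction fails to be a descent direction, the iteration falls back to the preconditioned steepest-descent step.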
