Abstract

The conjugate gradient (CG) and quasi-Newton (QN) methods each have advantages and drawbacks: quasi-Newton algorithms converge more rapidly than conjugate gradient algorithms, but they require more storage. In 1976, Buckley designed a method that combines the CG method with QN updates; its performance is better than that of conjugate gradient algorithms but not as good as that of the quasi-Newton approach. This type of method is called a preconditioned conjugate gradient (PCG) method. In this paper, we introduce two new preconditioned conjugate gradient (PCG) methods that combine the conjugate gradient method with a new quasi-Newton update. The new quasi-Newton update preserves positive definiteness, and the search direction of the new preconditioned conjugate gradient method is a descent direction. Numerical results show that the new preconditioned conjugate gradient method is more effective than standard preconditioning on several high-dimensional test problems.
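
For illustration only (a generic sketch, not the specific update proposed in this paper), a preconditioned conjugate gradient iteration typically takes the form
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -H_{k+1}\, g_{k+1} + \beta_k d_k,
\]
where $g_k$ denotes the gradient, $\alpha_k$ a line-search step length, $\beta_k$ the conjugacy parameter, and $H_{k+1}$ a quasi-Newton approximation of the inverse Hessian used as the preconditioner. When $H_{k+1}$ is positive definite and $\beta_k$ is chosen suitably (e.g., with an exact line search so that $g_{k+1}^{T} d_k = 0$), one obtains $g_{k+1}^{T} d_{k+1} < 0$, i.e., $d_{k+1}$ is a descent direction.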
