Abstract

In this paper, we discuss and investigate two nonlinear extended PR-CG methods that use both function and gradient values. The two new methods generalize the standard CG methods and possess the sufficient descent and global convergence properties under certain conditions. We report numerical results comparing the new methods with the PR-CG method of Wu and Chen (2010).

Highlights

  • Introduction: This paper considers the calculation of a local minimizer x*, say, for the problem min f(x), where f : R^n → R ......(1) is a smooth nonlinear function whose gradient is available (or can be calculated) but whose Hessian matrix is not.

  • Numerical Results: This section reports the performance of the new methods on a set of test problems.


Summary

Introduction

This paper considers the calculation of a local minimizer x*, say, for the problem:

min f(x), where f : R^n → R ......(1)

Here f is a smooth nonlinear function (of n variables) whose gradient vector g_k = ∇f(x_k) is available (or can be calculated), but whose Hessian matrix is not. Starting from the current iterate x_k, the Conjugate Gradient (CG) method has the following form:

x_{k+1} = x_k + α_k d_k ......(2a)
d_{k+1} = -g_{k+1} + β_k d_k ......(2b)

where α_k is a step-length, d_k is a search direction, and β_k is a parameter.
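The iteration described above can be sketched in code. The following is a minimal illustration of the standard Polak-Ribière CG iteration (not the paper's extended methods), where the step-length α_k is chosen here by a simple backtracking Armijo line search; the function names and the PR+ restart safeguard are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def pr_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f with a standard Polak-Ribiere nonlinear CG iteration.

    Illustrative sketch: step-length chosen by backtracking Armijo line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # initial search direction d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:       # stop when the gradient is small
            break
        # Backtracking Armijo line search for the step-length alpha_k
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d             # update (2a): x_{k+1} = x_k + alpha_k d_k
        g_new = grad(x_new)
        # Polak-Ribiere parameter beta_k = g_{k+1}^T (g_{k+1} - g_k) / g_k^T g_k
        beta = g_new.dot(g_new - g) / g.dot(g)
        beta = max(beta, 0.0)             # PR+ safeguard (illustrative assumption)
        d = -g_new + beta * d             # update (2b): d_{k+1} = -g_{k+1} + beta_k d_k
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = ||x - 1||^2
x_star = pr_cg(lambda x: np.sum((x - 1.0) ** 2),
               lambda x: 2.0 * (x - 1.0),
               np.zeros(3))
```

The extended methods of the paper replace the choice of β_k (and possibly the direction update) with formulas that also use function values; the loop structure above stays the same.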
