Abstract

In this article, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The positive step size is obtained by a line search, and the new scalar defining the search direction is derived from a quadratic model of the objective function and its Taylor series expansion, using the quasi-Newton condition and the Newton direction in the derivation of the new formula. We prove that the search direction of the new conjugate gradient method satisfies the sufficient descent condition, and we establish its global convergence under the standard assumptions, each of which is stated and verified. To assess the practical benefit of the method, we report numerical results, obtained with a FORTRAN implementation, comparing the new algorithm with the HS and PRP methods on the same set of unconstrained optimization test problems; the numerical results are efficient and encouraging.
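To make the setting of the abstract concrete, the following is a minimal sketch of a generic nonlinear conjugate gradient iteration with a backtracking (Armijo) line search for the positive step size, using the classical HS and PRP formulas for the scalar β that the paper compares against. The paper's new β formula is not reproduced here; the restart safeguard and the Armijo parameters are illustrative assumptions, not the authors' algorithm.

```python
# Sketch of a generic nonlinear conjugate gradient method (NOT the paper's
# new method): d_0 = -g_0, then d_{k+1} = -g_{k+1} + beta_k * d_k, with the
# step size alpha_k chosen by a backtracking (Armijo) line search.
import numpy as np

def beta_hs(g_new, g_old, d_old):
    # Hestenes-Stiefel: beta = g_{k+1}^T y_k / (d_k^T y_k), y_k = g_{k+1} - g_k
    y = g_new - g_old
    return float(g_new @ y) / float(d_old @ y)

def beta_prp(g_new, g_old, d_old):
    # Polak-Ribiere-Polyak: beta = g_{k+1}^T y_k / ||g_k||^2
    y = g_new - g_old
    return float(g_new @ y) / float(g_old @ g_old)

def cg_minimize(f, grad, x0, beta_fn=beta_prp, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search for a positive step size.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * float(g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(beta_fn(g_new, g, d), 0.0)  # nonnegative restart safeguard
        d = -g_new + beta * d
        if float(g_new @ d) >= 0.0:            # enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a simple strictly convex quadratic, f(x) = x1^2 + 4*x2^2.
f = lambda x: x[0] ** 2 + 4.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * x[0], 8.0 * x[1]])
x_star = cg_minimize(f, grad, [3.0, 1.0], beta_fn=beta_prp)
```

The sufficient descent property the abstract refers to is the condition g_k^T d_k ≤ -c‖g_k‖² for some c > 0; the explicit fallback to -g_new above is a crude stand-in for what the paper proves analytically about its new direction.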
