Abstract

In this article, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The positive step size is obtained by a line search, and the new scalar defining the search direction is derived from a quadratic model of the objective function and its Taylor expansion, using the quasi-Newton condition and the Newton direction in the derivation. We prove that the search direction of the new method satisfies the sufficient descent condition, and we state and verify all the assumptions required for the global convergence property. To assess the practical value of the method, we report numerical results, obtained with an implementation written in FORTRAN, comparing the new algorithm with the HS and PRP methods on the same set of unconstrained optimization test problems; the results show that the new method is efficient and encouraging.
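The abstract does not give the paper's new scalar formula, but the general scheme it builds on, nonlinear conjugate gradient iteration with a positive step size from a line search and a scalar such as HS or PRP defining the next direction, can be sketched as follows. This is a generic illustrative sketch, not the authors' algorithm: the function names (`prp_beta`, `hs_beta`, `armijo_step`, `cg_minimize`), the Armijo backtracking rule, the PRP+ nonnegativity safeguard, and the restart test are all assumptions chosen for a self-contained example.

```python
import numpy as np

def prp_beta(g_new, g_old, d):
    # Polak-Ribiere-Polyak scalar: g_new^T (g_new - g_old) / ||g_old||^2
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def hs_beta(g_new, g_old, d):
    # Hestenes-Stiefel scalar: g_new^T y / d^T y, with y = g_new - g_old
    y = g_new - g_old
    return (g_new @ y) / (d @ y)

def armijo_step(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    # Backtracking line search: shrink alpha until the Armijo
    # sufficient-decrease condition holds (alpha stays positive).
    while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
        alpha *= rho
    return alpha

def cg_minimize(f, grad, x0, beta_rule=prp_beta, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_step(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(beta_rule(g_new, g, d), 0.0)  # PRP+-style safeguard
        d = -g_new + beta * d
        if g_new @ d >= 0:       # restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic `f(x) = x1**2 + 10*x2**2`, calling `cg_minimize(f, grad, [1.0, 1.0])` drives the gradient norm below the tolerance and returns a point near the minimizer at the origin. A new method, such as the one proposed here, would replace `beta_rule` with its own scalar while keeping the rest of the iteration unchanged.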
