Abstract

Iterative methods such as the conjugate gradient method are well-known approaches for solving nonlinear unconstrained minimization problems, partly because of their ability to handle large-scale problems rapidly and partly because of their simple algebraic form and ease of implementation in computer programs. The conjugate gradient method has wide applications in many fields, such as machine learning and neural networks. Fletcher and Reeves [1] extended the approach to nonlinear problems in 1964; theirs is considered the first nonlinear conjugate gradient technique, and many new conjugate gradient methods have been proposed since. In this work, we propose a new conjugate gradient coefficient, based on the Hestenes-Stiefel parameter, for finding the minimum of nonlinear unconstrained optimization problems. Section one contains the derivation of the new method. In section two, we establish the descent and sufficient descent conditions. In section three, we study the global convergence of the newly proposed method. In section four, we give numerical results on some well-known test functions and compare the new method with the Hestenes-Stiefel method to demonstrate the effectiveness of the suggested method. Finally, we give conclusions.
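To make the baseline concrete, the following is a minimal sketch of the classical nonlinear conjugate gradient iteration with the Hestenes-Stiefel coefficient β_k = g_{k+1}ᵀy_k / (d_kᵀy_k), y_k = g_{k+1} − g_k, which the abstract names as the starting point; it is a generic textbook version, not the paper's new coefficient, and the line-search parameters and the quadratic example are illustrative assumptions.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with the Hestenes-Stiefel beta
    (a generic textbook sketch, not the paper's proposed coefficient):
        beta_k = g_{k+1}^T y_k / (d_k^T y_k),  y_k = g_{k+1} - g_k.
    Step sizes come from a simple backtracking Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:           # safeguard: restart if not a descent direction
            d = -g
        # Backtracking Armijo line search (illustrative parameter choices).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g               # gradient difference y_k
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d       # new search direction
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose exact minimizer solves A x = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hs_conjugate_gradient(f, grad, np.zeros(2))
```

On such quadratics the Hestenes-Stiefel direction reduces to the linear conjugate gradient direction, which is why this method is a natural reference point for the comparison described in section four.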
