Abstract

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to the simplicity of its iterations and its very low memory requirements. This paper proposes a conjugate gradient method which is similar to the Dai-Liao conjugate gradient method (Dai and Liao, 2001) but has stronger convergence properties. The proposed method possesses the sufficient descent condition and is globally convergent under the strong Wolfe-Powell (SWP) line search for general functions. Our numerical results show that the proposed method is very efficient on the test problems.
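
For reference, the CG framework discussed here generates iterates of the form below. The Dai-Liao choice of βk (Dai and Liao, 2001) is also recalled, since the proposed formula, which is not reproduced in this summary, is modeled on it; here gk denotes the gradient at xk and t ≥ 0 is the Dai-Liao parameter.

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
\[
\beta_k^{DL} = \frac{g_{k+1}^{\top}\,(y_k - t\, s_k)}{d_k^{\top} y_k}, \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k.
\]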

Highlights

  • The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to the simplicity of its iterations and its very low memory requirements

  • This paper proposes a conjugate gradient method which is similar to the Dai-Liao conjugate gradient method (Dai and Liao, 2001) but has stronger convergence properties

  • Motivated by the above idea, in this paper we focus on finding a new conjugate gradient method which possesses the following properties: (1) the nonnegativity property βk ≥ 0; (2) the new formula contains gradient information and some Hessian information; (3) the search directions dk generated by the proposed method satisfy the sufficient descent condition (21); the standard forms of the descent and line-search conditions are recalled after this list
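
Condition (21) itself is not reproduced in this summary. In the CG literature, the sufficient descent condition and the strong Wolfe-Powell (SWP) line search conditions conventionally take the following forms, with constants c > 0 and 0 < δ < σ < 1; these are standard statements, assumed here rather than quoted from the paper:

\[
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k,
\]
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k, \qquad
\bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| \le \sigma\,\bigl| g_k^{\top} d_k \bigr|.
\]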


Summary

Introduction

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to the simplicity of its iterations and its very low memory requirements. Powell [3] constructed a counterexample showing that the PR method and the HS method can cycle infinitely without approaching a solution. This example suggests that these two methods share a drawback: they are not globally convergent for general functions. Using a new conjugacy condition, Dai and Liao [4] proposed two new methods. One of their methods is globally convergent for general functions and performs better than the HS and PR methods. Similar to Dai and Liao's approach, we propose another formula for βk, analyze the convergence properties of the given method, and carry out numerical experiments which show that the given method is robust and efficient.
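
To make the framework concrete, here is a minimal sketch in Python of a Dai-Liao-type CG loop under a strong Wolfe line search. It implements the generic scheme from [4] that this paper builds on, not the authors' proposed βk; the function name cg_dai_liao and the parameter choices (t, tol, the Wolfe constants) are illustrative assumptions.

import numpy as np
from scipy.optimize import line_search  # enforces the (strong) Wolfe conditions

def cg_dai_liao(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Nonlinear CG with a Dai-Liao beta (Dai and Liao, 2001).

    A generic sketch of the framework the paper builds on, not the
    authors' proposed formula. t >= 0 is the Dai-Liao parameter;
    tol and max_iter are illustrative choices.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search along d (c2 = 0.1 is customary for CG).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:           # line search failed: crude restart with -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Dai-Liao beta: g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        denom = d @ y
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        beta = max(beta, 0.0)       # nonnegativity safeguard, cf. property (1)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a standard test function (Rosenbrock):
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_star = cg_dai_liao(rosen, rosen_der, np.array([-1.2, 1.0]))
    print(x_star)  # should approach [1, 1]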

Motivations and New Nonlinear Conjugate Gradient Method
Convergence Analysis
Numerical Results
Conclusions
