Abstract

At present, the conjugate gradient (CG) method of Hager and Zhang (Hager and Zhang, SIAM Journal on Optimization, 16 (2005)) is regarded as one of the most effective CG methods for optimization problems. To study the CG method further, we extend the Hager–Zhang method and present two modified CG formulas that use not only gradient values but also function values. Moreover, the sufficient descent condition holds without any line search. Global convergence is established for nonconvex functions under suitable conditions. Numerical results show that the proposed methods are competitive with the standard conjugate gradient method.
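
For orientation, the following minimal sketch (Python, with illustrative names and random test data) implements the original Hager–Zhang direction that the abstract's modified formulas build on, and numerically checks the known sufficient descent bound $g_{k+1}^T d_{k+1} \le -\tfrac{7}{8}\|g_{k+1}\|^2$, which holds for this direction independently of the line search. The paper's two modified formulas themselves are not reproduced here.

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """Hager-Zhang CG direction (Hager & Zhang, SIAM J. Optim. 16, 2005).

    Satisfies the sufficient descent bound g^T d <= -(7/8)||g||^2
    for any step size, which is the property the abstract refers to.
    """
    y = g_new - g_old
    dy = d_old @ y                      # d_k^T y_k
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d_old

# Quick numerical check of the descent bound on random (illustrative) data.
rng = np.random.default_rng(0)
g_old, g_new, d_old = rng.normal(size=(3, 5))
d_new = hz_direction(g_new, g_old, d_old)
assert g_new @ d_new <= -7.0 / 8.0 * (g_new @ g_new) + 1e-12
```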

Highlights

  • Consider the following unconstrained optimization problem: $\min f(x),\ x \in \mathbb{R}^n$, (1.1) where $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable

  • Motivated by the above observations, we study whether there exists another quasi-Newton formula whose approximation of the Hessian of the objective function is, in some sense, not inferior to that of formula (2.1) or the standard BFGS formula, and which possesses global convergence and superlinear convergence for general convex functions with numerical results competitive with those of other similar methods

  • Global convergence and superlinear convergence have been established for general convex functions. Numerical results show that this method is interesting. Zhang, Deng, and Chen [54] presented the following quasi-Newton equation: $B_{k+1} s_k = y_k^{3*} = y_k + A_k s_k$, (2.9) which modifies the classical secant equation $B_{k+1} s_k = y_k$ (see the sketch after this list)
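
To make the role of equation (2.9) concrete: the classical BFGS update satisfies the secant equation $B_{k+1} s_k = y_k$, and (2.9) replaces the right-hand side with $y_k^{3*} = y_k + A_k s_k$ so that function-value information enters through $A_k$ (defined in the full text, not reproduced here). The minimal sketch below, with illustrative data, verifies the classical secant equation being modified.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B.

    The result satisfies the classical secant (quasi-Newton) equation
    B_new @ s = y; equation (2.9) above replaces the right-hand side
    y_k by y_k^{3*} = y_k + A_k s_k to inject function-value information.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Verify the secant equation numerically on illustrative data.
rng = np.random.default_rng(1)
n = 4
B = np.eye(n)
s = rng.normal(size=n)
y = s + 0.1 * rng.normal(size=n)   # keep y^T s > 0 (curvature condition)
B_new = bfgs_update(B, s, y)
assert np.allclose(B_new @ s, y)
```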


Summary

INTRODUCTION

Considering the above suggestion, Gilbert and Nocedal [19] proved that the modified PRP method with $\beta_k = \max\{\beta_k^{PRP}, 0\}$ is globally convergent with the WWP line search under the assumption that the sufficient descent condition holds. Some formulas that possess the global convergence property with the WWP line search (such as $\beta_k^{DY}$ [12]) did not perform better than the PRP method in numerical computation. Is there any nonlinear conjugate gradient formula that possesses the sufficient descent property (1.8) without any line search? (j) The method with the WWP line search rule (or other line search rules) has some strong convergence properties; at the least, the method with the given formula and the WWP line search rule (or other line search rules) may generate a sufficient descent direction at each iteration and converge globally.
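
The sketch below, written under stated assumptions (the test function, gradient, and all names are illustrative, not from the paper), shows the PRP+ rule $\beta_k = \max\{\beta_k^{PRP}, 0\}$ combined with a weak Wolfe-Powell line search, plus a steepest descent restart whenever the sufficient descent condition $g_k^T d_k \le -c\|g_k\|^2$ fails. This restart is exactly the safeguard that a formula with built-in sufficient descent would make unnecessary.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection line search for the weak Wolfe-Powell (WWP) conditions:
    f(x + a*d) <= f(x) + c1*a*g'd   and   grad(x + a*d)'d >= c2*g'd."""
    lo, hi = 0.0, np.inf
    fx, gd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:    # Armijo part fails
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * gd:        # curvature part fails
            lo = alpha
        else:
            return alpha
        alpha = (lo + hi) / 2.0 if np.isfinite(hi) else 2.0 * lo
    return alpha

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=500, c=1e-4):
    """PRP+ method of Gilbert and Nocedal: beta_k = max{beta_k^PRP, 0},
    restarting with -g_k whenever the sufficient descent condition
    g_k'd_k <= -c*||g_k||^2 fails; the open question in the text asks for
    a formula that never needs this restart."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d, restarts = -g, 0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d > -c * (g @ g):       # sufficient descent (1.8) violated
            d = -g                     # steepest descent restart
            restarts += 1
        alpha = weak_wolfe(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)   # PRP+
        d, g = -g_new + beta * d, g_new
    return x, restarts

# Hypothetical test problem, purely for illustration.
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
print(prp_plus_cg(f, grad, np.zeros(2)))   # converges near (1, -2)
```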

Motivations based on BFGS formula
Algorithms
THE SUFFICIENT DESCENT PROPERTY AND THE GLOBAL CONVERGENCE
NUMERICAL RESULTS
CONCLUSION