A Global Convergence of Spectral Conjugate Gradient Method for Large Scale Optimization

Abstract

Highlights

  • Let f : Rn → R be a continuously differentiable function

  • A conjugate gradient (CG) method generates a sequence of iterates by setting xk = xk−1 + αk−1 dk−1, k = 1, 2, …

  • In this paper we propose two spectral CG methods, based on modifications of the standard conjugate descent (CD) method in (4), together with a suitable spectral parameter for each one, yielding effective spectral CD-CG methods



Introduction

Let f : Rn → R be a continuously differentiable function, and consider the unconstrained nonlinear optimization problem: minimize f(x), x ∈ Rn. (1) We write g(x) for the gradient of f at x. Because it requires little computer memory, the conjugate gradient method is very appealing for solving (1) when the number of variables is large. A conjugate gradient (CG) method generates a sequence of iterates by setting xk = xk−1 + αk−1 dk−1, k = 1, 2, …, where αk−1 is the step size and the search direction is updated by dk = −gk + βk dk−1, with a scalar βk that determines the particular CG method [11]. In their survey paper, Hager and Zhang [9] reviewed the development of various nonlinear CG methods, with special attention given to global convergence properties.
