Abstract

In this paper, we first propose a new three-term conjugate gradient (CG) method, named LSTT, which uses a least-squares technique to determine the CG parameter. We then present two improved variants of the LSTT method, aiming to obtain global convergence for general nonlinear functions. The least-squares technique used here combines the advantages of two existing efficient CG methods. The search directions produced by the three proposed methods are sufficient descent directions, independent of any line search procedure. Moreover, with the Wolfe–Powell line search, LSTT is proved to be globally convergent for uniformly convex functions, and the two improved variants are globally convergent for general nonlinear functions. Preliminary numerical results are reported to illustrate that our methods are efficient and have advantages over two well-known three-term CG methods.

Highlights

  • Consider the unconstrained optimization problem min f(x), x ∈ R^n, where f : R^n → R is a continuously differentiable function whose gradient is denoted by g(x). Conjugate gradient (CG) methods are among the most efficient methods for unconstrained optimization due to their simple structure, low storage requirements, and good numerical behavior.

  • The following lemma shows that the direction d_k^LSTT in (11) is a sufficient descent direction, independent of the line search used.

  • Lemma 3: Suppose that the sequence {d_k} of directions is generated by Algorithm 2, and that the stepsize α_k is calculated by the Wolfe–Powell line search (1) and (2).


Summary

Introduction

Consider the following unconstrained optimization problem:

    min f(x),  x ∈ R^n,

where f : R^n → R is a continuously differentiable function whose gradient is denoted by g(x). Conjugate gradient (CG) methods are among the most efficient methods for unconstrained optimization due to their simple structure, low storage requirements, and good numerical behavior.

– The Wolfe–Powell line search: the stepsize α_k satisfies the following two relations:

    f(x_k + α_k d_k) − f(x_k) ≤ δ α_k g_k^T d_k,    (1)
    g(x_k + α_k d_k)^T d_k ≥ σ g_k^T d_k.           (2)

– The strong Wolfe–Powell line search: the stepsize α_k satisfies both (1) and the following relation:

    |g(x_k + α_k d_k)^T d_k| ≤ σ |g_k^T d_k|.

With the Wolfe–Powell line search, LSTT is proved to be globally convergent for uniformly convex functions.
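The two Wolfe–Powell relations above translate directly into an acceptance test for a trial stepsize. The following Python sketch is illustrative only: the function name `wolfe_powell_satisfied`, the toy quadratic test problem, and the parameter defaults δ = 1e-4 and σ = 0.9 are assumptions for demonstration, not values taken from the paper.

```python
def dot(u, v):
    """Inner product of two vectors given as plain Python sequences."""
    return sum(a * b for a, b in zip(u, v))

def wolfe_powell_satisfied(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the Wolfe-Powell conditions (1)-(2) for a trial stepsize alpha.

    Hypothetical helper; delta and sigma defaults are common illustrative
    choices, with 0 < delta < sigma < 1.
    """
    g = grad(x)
    gd = dot(g, d)  # g_k^T d_k; negative when d is a descent direction
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    sufficient_decrease = f(x_new) - f(x) <= delta * alpha * gd   # condition (1)
    curvature = dot(grad(x_new), d) >= sigma * gd                 # condition (2)
    return sufficient_decrease and curvature

# Toy check on the strictly convex quadratic f(x) = 0.5 * x^T x.
f = lambda x: 0.5 * dot(x, x)
grad = lambda x: list(x)            # gradient of f is x itself
x0 = [1.0, -2.0]
d0 = [-gi for gi in grad(x0)]       # steepest-descent direction
print(wolfe_powell_satisfied(f, grad, x0, d0, alpha=1.0))   # → True
print(wolfe_powell_satisfied(f, grad, x0, d0, alpha=1e-12)) # → False (fails (2))
```

A very small stepsize satisfies the sufficient-decrease condition (1) but violates the curvature condition (2), which is precisely what (2) is designed to rule out.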
