Abstract

In this paper, a new family of Dai-Liao–type conjugate gradient methods is proposed for unconstrained optimization problems. In the new methods, the modified secant equation used in [H. Yabe and M. Takano, Comput. Optim. Appl. 28 (2004) 203–225] is incorporated into Dai and Liao's conjugacy condition. Under certain assumptions, we show that our methods are globally convergent for general functions under the strong Wolfe line search. Numerical results illustrate that the proposed methods can outperform some existing ones.
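
For context, here is a minimal sketch of the ingredients named above, assuming the standard forms from the cited works; the paper's exact parameterization of the family may differ. With search directions $d_0 = -g_0$ and $d_{k+1} = -g_{k+1} + \beta_k d_k$, Dai and Liao's conjugacy condition $d_{k+1}^T y_k = -t\, g_{k+1}^T s_k$ (with $t \ge 0$, $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$) leads to

$$\beta_k^{DL} = \frac{g_{k+1}^T (y_k - t\, s_k)}{d_k^T y_k}.$$

The Yabe–Takano modified secant equation $B_{k+1} s_k = z_k$ replaces $y_k$ by

$$z_k = y_k + \varrho\, \frac{\theta_k}{s_k^T u_k}\, u_k, \qquad \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^T s_k,$$

where $\varrho \ge 0$ and $u_k$ is any vector with $s_k^T u_k \ne 0$; using $z_k$ in place of $y_k$ in the conjugacy condition gives a Dai-Liao–type parameter $\beta_k = g_{k+1}^T (z_k - t\, s_k) / (d_k^T z_k)$.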

Highlights

  • Consider the following unconstrained optimization problem: $\min_{x \in \mathbb{R}^n} f(x)$, (1.1) where the objective function $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable and its gradient $g(x)$ is available (a sketch of the resulting CG iteration follows this list)

  • Numerical methods for its efficient and effective solution have been intensively studied in the literature, including spectral gradient methods [5, 15], conjugate gradient methods [4, 13], and memoryless BFGS methods [16]

  • Since an exact line search for the step size $\alpha_k$ is usually expensive and impractical, the strong Wolfe inexact line search is often used in the convergence analysis and implementation of nonlinear conjugate gradient methods
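
To make the iteration concrete, the following is a minimal Python sketch of a nonlinear conjugate gradient loop using the Dai-Liao parameter. The function name dai_liao_cg, the line_search callable, and the default t=0.1 are illustrative assumptions; the paper's family is derived from a modified conjugacy condition and may compute $\beta_k$ differently:

```python
import numpy as np

def dai_liao_cg(f, grad, x0, line_search, t=0.1, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the Dai-Liao parameter (illustrative sketch).

    line_search(f, grad, x, d) -> alpha, e.g. a strong Wolfe routine.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d)
        x_new = x + alpha * d               # x_{k+1} = x_k + alpha_k d_k
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g         # s_k and y_k
        # Dai-Liao parameter: beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        beta = g_new @ (y - t * s) / (d @ y)
        d = -g_new + beta * d               # d_{k+1} = -g_{k+1} + beta_k d_k
        x, g = x_new, g_new
    return x
```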


Summary

Introduction

The objective function $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable and its gradient $g(x)$ is available. Since an exact line search for the step size $\alpha_k$ is usually expensive and impractical, the strong Wolfe inexact line search is often used in the convergence analysis and implementation of nonlinear conjugate gradient methods. It aims to find a step size $\alpha_k$ satisfying the following two strong Wolfe conditions:

$$f(x_k + \alpha_k d_k) \le f(x_k) + \rho\, \alpha_k\, g_k^T d_k, \qquad |g_{k+1}^T d_k| \le \sigma\, |g_k^T d_k|,$$

where $0 < \rho < \sigma < 1$. A minimal check of these conditions is sketched below.
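
As an illustration of the two conditions above, here is a minimal Python sketch that verifies them for a trial step size. The helper name satisfies_strong_wolfe and the parameter defaults rho=1e-4 and sigma=0.1 are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x_k, d_k, alpha, rho=1e-4, sigma=0.1):
    """Check the two strong Wolfe conditions for a trial step size alpha.

    f    : callable, objective f(x) -> float
    grad : callable, gradient g(x) -> ndarray
    Requires 0 < rho < sigma < 1 and a descent direction (g_k^T d_k < 0).
    """
    g_k = grad(x_k)
    slope = g_k @ d_k                     # g_k^T d_k, negative for descent
    x_next = x_k + alpha * d_k
    sufficient_decrease = f(x_next) <= f(x_k) + rho * alpha * slope
    curvature = abs(grad(x_next) @ d_k) <= sigma * abs(slope)
    return sufficient_decrease and curvature

# Example on f(x) = ||x||^2 / 2 with the steepest-descent direction d = -g:
x = np.array([1.0, -2.0])
print(satisfies_strong_wolfe(lambda v: 0.5 * v @ v, lambda v: v, x, -x, alpha=1.0))
# True: the full step lands at the minimizer, satisfying both conditions
```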

New Dai-Liao–Type Methods
Convergence Analysis
Numerical Experiments
Conclusions