Abstract

In this paper, we present a new conjugate gradient method with an acceleration scheme for solving large-scale unconstrained optimization problems. The generated search direction satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition independently of the line search. Moreover, the conjugate gradient parameter incorporates more useful information without adding computational cost or storage requirements, which can improve numerical performance. Under proper assumptions, we establish the global convergence of the proposed method with a Wolfe line search. Numerical experiments show that the method is competitive on unconstrained optimization problems of dimension up to 100,000.
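For reference, the two conditions named above take the following standard forms, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$; the constants $c > 0$ and $t \ge 0$ are generic, and the paper's specific choices are not reproduced here:

    g_k^T d_k \le -c \, \|g_k\|^2          % sufficient descent condition
    d_{k+1}^T y_k = -t \, g_{k+1}^T s_k    % Dai–Liao conjugacy condition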

Highlights

  • Consider the following unconstrained optimization problem: $\min f(x),\ x \in \mathbb{R}^n$, (1) where $f : \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function, bounded below, whose gradient is denoted by $g(x) = \nabla f(x)$; a generic solver skeleton for this problem is sketched after this list

  • By focusing on the above research, we are interested in developing a new accelerated conjugate gradient method (NACG) for large-scale unconstrained optimization

  • From the conclusions: conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems due to their simplicity and low storage requirements
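As context for these highlights, the following is a minimal Python sketch of the generic nonlinear conjugate gradient framework for problem (1). The fixed step length and the Fletcher–Reeves choice of $\beta_k$ are placeholders only; the paper's NACG method instead uses a Wolfe line search, an acceleration scheme, and its own parameter, none of which are reproduced here.

    import numpy as np

    def cg_skeleton(grad, x0, tol=1e-6, max_iter=10_000):
        # Generic nonlinear CG for problem (1): min f(x), x in R^n.
        # Only g(x) = grad f(x) is needed here, because the fixed step
        # below stands in for the paper's Wolfe line search.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                # d_0 = -g_0 (steepest descent)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = 1e-3                      # placeholder step length
            x = x + alpha * d
            g_new = grad(x)
            beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves placeholder
            d = -g_new + beta * d             # d_{k+1} = -g_{k+1} + beta_k d_k
            g = g_new
        return x

Calling cg_skeleton(lambda x: 2 * x, np.ones(100_000)) applies the skeleton to $f(x) = \|x\|^2$ at the paper's maximum test dimension; only a handful of length-$n$ vectors are stored, which is the low-storage property the conclusion refers to.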


Summary

Introduction

The line search in conjugate gradient methods is usually based on the general Wolfe conditions [33, 34],

    f(x_k + \alpha_k d_k) - f(x_k) \le \rho \alpha_k g_k^T d_k,   (4)
    g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,             (5)

with $0 < \rho < \sigma < 1$. Based on the Dai–Liao conjugacy condition $d_{k+1}^T y_k = -t \, g_{k+1}^T s_k$ (7), Dai and Liao [8] introduced the conjugate gradient parameter $\beta_k^{DL}$ as follows:

    \beta_k^{DL} = \frac{g_{k+1}^T (y_k - t s_k)}{d_k^T y_k},   t \ge 0,

where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. Building on this research, we are interested in developing a new accelerated conjugate gradient method (NACG) for large-scale unconstrained optimization.
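As a concrete illustration, here is a minimal Python sketch of the Wolfe test and of the Dai–Liao parameter above; the function names and the default values of rho, sigma, and t are illustrative assumptions, not the paper's NACG choices.

    import numpy as np

    def wolfe_ok(f, grad, x, d, alpha, rho=1e-4, sigma=0.9):
        # Check the Wolfe conditions (4)-(5) for a trial step alpha,
        # with 0 < rho < sigma < 1.
        g_d = grad(x) @ d
        armijo = f(x + alpha * d) - f(x) <= rho * alpha * g_d    # (4)
        curvature = grad(x + alpha * d) @ d >= sigma * g_d       # (5)
        return armijo and curvature

    def beta_dl(g_next, d, s, y, t=0.1):
        # Dai-Liao parameter: beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k).
        return g_next @ (y - t * s) / (d @ y)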

