Abstract

The conjugate gradient method is very effective for solving large-scale unconstrained optimization problems. In this paper, on the basis of the conjugate parameter of the conjugate descent (CD) method and the second inequality of the strong Wolfe line search, two new conjugate parameters are devised. Using the strong Wolfe line search to obtain the step lengths, two modified conjugate gradient methods are proposed for general unconstrained optimization. Under standard assumptions, the two proposed methods are proved to satisfy the sufficient descent condition and to be globally convergent. Finally, preliminary numerical results are reported to show that the proposed methods are promising.

Highlights

  • We focus on the ideas of the JMJ and conjugate descent (CD) methods, as well as the strong Wolfe line search

  • If there exists a constant $c > 0$ such that $g_k^{\top} d_k \le -c\|g_k\|^2$ for all $k \ge 1$, we say that the search direction $d_k$ of the method satisfies the sufficient descent condition, which is often used to analyze the convergence properties of CGMs for problem (1) under inexact line searches; see, e.g., [10,11,12,13,14] (a minimal numerical check of this condition is sketched after this list)

  • For the LMYCD1 method, we show that it has properties similar to those of the DY method, which is very important for analyzing the global convergence of the method
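
As a concrete illustration of the sufficient descent condition in the highlights above, the following is a minimal Python/NumPy sketch of the check $g_k^{\top} d_k \le -c\|g_k\|^2$; the constant c = 1e-4 and the test vectors are hypothetical choices, not values from the paper.

    import numpy as np

    def is_sufficient_descent(g, d, c=1e-4):
        # Sufficient descent condition: g_k^T d_k <= -c * ||g_k||^2.
        return float(g @ d) <= -c * float(g @ g)

    # Example: the steepest descent direction d = -g satisfies the
    # condition for any 0 < c <= 1, since g^T (-g) = -||g||^2.
    g = np.array([1.0, -2.0])
    print(is_sufficient_descent(g, -g))  # True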

Introduction

The iterates are generated by $x_{k+1} = x_k + \alpha_k d_k$. First, the step length $\alpha_k > 0$ is usually obtained by a suitable inexact line search along the search direction $d_k$, such as the Wolfe line search. The method keeps the descent property at each iteration and converges globally for general nonconvex functions under the Wolfe line search. By the second inequality of the strong Wolfe line search, it follows that $|g_k^{\top} d_{k-1}| / (-d_{k-1}^{\top} g_{k-1}) \le \sigma$ whenever $d_{k-1}^{\top} g_{k-1} < 0$.
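
To make this iteration concrete, here is a minimal Python sketch of a conjugate gradient loop of this form, using the classical CD parameter $\beta_k = \|g_k\|^2 / (-d_{k-1}^{\top} g_{k-1})$ and SciPy's strong Wolfe line search (whose parameter c2 plays the role of $\sigma$). This is an illustrative sketch only, not the paper's LMYCD1 or LMYCD2 methods, whose modified parameter formulas are not given in this excerpt.

    import numpy as np
    from scipy.optimize import line_search  # enforces the strong Wolfe conditions

    def cd_cg(f, grad, x0, tol=1e-6, max_iter=1000, sigma=0.1):
        # Illustrative CG loop with the classical CD parameter;
        # not the paper's LMYCD1/LMYCD2 methods.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # Strong Wolfe line search: |grad(x + a d)^T d| <= sigma * (-g^T d).
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=sigma)
            if alpha is None:  # line search failed to find a suitable step
                break
            x = x + alpha * d
            g_new = grad(x)
            beta = (g_new @ g_new) / (-(d @ g))  # CD conjugate parameter
            d = -g_new + beta * d
            g = g_new
        return x

    # Usage on the Rosenbrock function shipped with SciPy:
    from scipy.optimize import rosen, rosen_der
    print(cd_cg(rosen, rosen_der, np.array([-1.2, 1.0])))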
