Abstract
The Barzilai–Borwein conjugate gradient methods, first proposed by Dai and Kou (Sci China Math 59(8):1511–1524, 2016), are conceptually appealing and highly efficient for strictly convex quadratic minimization. In this paper, we present an efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. Motivated by the Barzilai–Borwein method and the linear conjugate gradient method, we derive a new search direction that satisfies the sufficient descent condition from a quadratic model on a two-dimensional subspace, and we design a new strategy for choosing the initial stepsize. We also propose a generalized Wolfe line search, which is nonmonotone and avoids a numerical drawback of the original Wolfe line search. Under mild conditions, we establish the global convergence and the R-linear convergence of the proposed method; in particular, we analyze convergence for convex functions. Numerical results on the CUTEr library and the test problem collection of Andrei show that the proposed method outperforms two well-known conjugate gradient methods, proposed by Dai and Kou (SIAM J Optim 23(1):296–320, 2013) and by Hager and Zhang (SIAM J Optim 16(1):170–192, 2005), respectively.
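For context, the classical ingredients that the abstract refers to can be sketched as follows; these are the standard textbook formulas, not the paper's generalized versions, which are given only in the full text. With $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = g_k - g_{k-1}$, the two Barzilai–Borwein stepsizes are
\[
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}};
\]
the sufficient descent condition on a search direction $d_k$ requires
\[
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for some constant } c > 0;
\]
and the original (monotone) Wolfe line search, which the proposed nonmonotone line search generalizes, asks for a stepsize $\alpha_k$ satisfying
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \rho\,\alpha_k\, g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k,
\]
with $0 < \rho < \sigma < 1$.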