Abstract

Combined with non-monotone line search, the Barzilai and Borwein (BB) gradient method has been successfully extended for solving unconstrained optimization problems and is competitive with conjugate gradient methods. In this paper, we establish the R-linear convergence of the BB method for strongly convex quadratics of any dimension. One corollary of this result is that the BB method is also locally R-linearly convergent for general objective functions, and hence the stepsize in the BB method will always be accepted by the non-monotone line search when the iterate is close to the solution.
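
For context, the following is a minimal sketch of the BB gradient method applied to a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, using the standard BB stepsize alpha_k = (s^T s)/(s^T y) with s = x_{k+1} - x_k and y = g_{k+1} - g_k. The test problem, initial stepsize, and stopping rule are illustrative assumptions and are not taken from the paper.

import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Barzilai-Borwein gradient method for the quadratic 0.5 x^T A x - b^T x."""
    x = x0.copy()
    g = A @ x - b                  # gradient of the quadratic at x
    alpha = 1.0                    # initial stepsize (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s = x_new - x              # s_k = x_{k+1} - x_k
        y = g_new - g              # y_k = g_{k+1} - g_k
        alpha = (s @ s) / (s @ y)  # BB stepsize: (s^T s) / (s^T y)
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)    # symmetric positive definite => strongly convex
    b = rng.standard_normal(n)
    x_star = np.linalg.solve(A, b)
    x = bb_gradient(A, b, np.zeros(n))
    print("error:", np.linalg.norm(x - x_star))

In the quadratic case above no line search is needed; for general objective functions the BB stepsize is typically combined with a non-monotone line search, as described in the abstract.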
