Abstract

The Barzilai and Borwein (BB) gradient method has received much attention since it performs much better than the classical steepest descent method. In this paper, we analyze a positive BB-like gradient stepsize and discuss its possible uses. Specifically, we present an analysis of the positive stepsize for two-dimensional strictly convex quadratic functions and prove R-superlinear convergence under a suitable assumption. Meanwhile, we extend BB-like methods to the solution of symmetric linear systems and find that a variant of the positive stepsize is very useful in this context. Some useful discussions on the positive stepsize are also given.

Keywords: Unconstrained optimization, Barzilai and Borwein gradient method, Quadratic function, R-superlinear convergence, Condition number
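For context, the following minimal sketch (not the paper's exact method) runs a BB-type gradient iteration on a two-dimensional strictly convex quadratic f(x) = 0.5 x^T A x - b^T x. It uses the classical BB quantities s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, and, as an illustrative positive BB-like choice, the geometric-mean stepsize ||s_{k-1}|| / ||y_{k-1}|| = sqrt(alpha_k^{BB1} alpha_k^{BB2}), which is always positive. The matrix A, the vector b, and this particular stepsize are assumptions made for illustration and need not coincide with the positive stepsize analyzed in the paper.

import numpy as np

# Two-dimensional strictly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# gradient g(x) = A x - b, with A symmetric positive definite.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])
b = np.array([1.0, 1.0])

x = np.array([5.0, -3.0])          # starting point
g = A @ x - b
alpha = 1.0 / np.linalg.norm(A)    # conservative initial stepsize

for k in range(100):
    x_new = x - alpha * g
    g_new = A @ x_new - b
    if np.linalg.norm(g_new) < 1e-10:
        x, g = x_new, g_new
        break
    s = x_new - x                  # s_{k-1} = x_k - x_{k-1}
    y = g_new - g                  # y_{k-1} = g_k - g_{k-1} (= A s for quadratics)
    # Positive BB-like stepsize: geometric mean of BB1 = s^T s / s^T y
    # and BB2 = s^T y / y^T y, i.e. ||s|| / ||y||.
    alpha = np.linalg.norm(s) / np.linalg.norm(y)
    x, g = x_new, g_new

print("iterations:", k, "solution:", x, "gradient norm:", np.linalg.norm(g))

Because y_{k-1} = A s_{k-1} for a quadratic, this stepsize lies between the reciprocals of the largest and smallest eigenvalues of A, so the iteration is well defined whenever the gradient is nonzero.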

