Abstract
Although quasi-Newton methods have been extensively studied in the literature, they either guarantee only local convergence or rely on a sequence of line searches to achieve global convergence. In this work, we propose a line-search-free greedy quasi-Newton method with adaptive steps and establish explicit nonasymptotic bounds for both the global convergence rate and the local superlinear rate. Our key idea is twofold: multiple greedy quasi-Newton updates to control the Hessian approximation error, and a simple mechanism for adjusting stepsizes that ensures function improvement at every iteration. The global superlinear convergence of our method is validated via numerical experiments.
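To make the "greedy quasi-Newton update" idea concrete, the sketch below shows one greedy SR1-style update that drives a Hessian approximation `G` toward a fixed target matrix `A` by picking the coordinate direction with the largest residual. This is a minimal illustration of the generic greedy update principle (in the style of Rodomanov and Nesterov's greedy quasi-Newton updates), not the paper's exact multi-update scheme or its adaptive stepsize rule; the function name and setup are hypothetical.

```python
import numpy as np

def greedy_sr1_update(G, A):
    """One greedy SR1-style update of the approximation G toward the
    target matrix A. Assumes G - A is positive semidefinite, as in the
    standard greedy quasi-Newton setting.

    The greedy direction is the coordinate vector e_i with the largest
    diagonal residual (G - A)_{ii}; the SR1 correction then zeroes out
    row and column i of the residual G - A.
    """
    R = G - A                       # residual; PSD by assumption
    i = int(np.argmax(np.diag(R)))  # greedy coordinate choice
    denom = R[i, i]
    if denom < 1e-12:               # residual (numerically) zero
        return G
    r = R[:, i]
    return G - np.outer(r, r) / denom  # rank-one SR1 correction

# Toy demonstration: each greedy update shrinks the residual, and
# n updates reconstruct an n-by-n target exactly (SR1 finite termination).
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)         # symmetric positive definite target
P = rng.standard_normal((n, n))
G = A + P @ P.T                      # start with G - A PSD
for _ in range(n):
    G = greedy_sr1_update(G, A)
print(np.linalg.norm(G - A, "fro"))  # residual is (numerically) zero
```

In an actual quasi-Newton method, `A` would be the (unavailable) true Hessian and the update would use Hessian-vector products along the greedy direction rather than the full matrix; the sketch only illustrates how repeated greedy rank-one corrections contract the approximation error.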