Abstract

The delayed weighted gradient method, recently introduced in Oviedo-Leon (Comput Optim Appl 74:729–746, 2019), is a low-cost gradient-type method that exhibits a surprisingly fast, and perhaps unexpected, convergence behavior that competes favorably with the well-known conjugate gradient method for the minimization of convex quadratic functions. In this work, we establish several orthogonality properties that add to the understanding of the practical behavior of the method, including its finite termination. We show that if the $n \times n$ real Hessian matrix of the quadratic function has only $p < n$ distinct eigenvalues, then the method terminates in $p$ iterations. We also establish an optimality condition, concerning the gradient norm, that motivates the future use of this novel scheme when low precision is required for the minimization of non-quadratic functions.

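The abstract does not reproduce the method's update rules, so the following is a minimal NumPy sketch reconstructing them from Oviedo-Leon (2019) as commonly described: each iteration takes a minimal-gradient step to an auxiliary point $y_k$, then moves to the point on the segment joining the delayed iterate $x_{k-1}$ and $y_k$ that minimizes the gradient norm. The function name `dwgm` and the test problem are illustrative, not from the paper; treat this as a sketch under those assumptions rather than a reference implementation.

```
# Sketch of the delayed weighted gradient method (DWGM) for minimizing
# f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite,
# assuming the update rules of Oviedo-Leon (2019) as reconstructed above.
import numpy as np

def dwgm(A, b, x0, tol=1e-10, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                        # gradient of the quadratic at x0
    x_prev, g_prev = x.copy(), g.copy()  # delayed pair; convention x_{-1} = x_0
    norms = [np.linalg.norm(g)]
    for _ in range(max_iter):
        if norms[-1] <= tol:
            break
        w = A @ g
        alpha = (g @ w) / (w @ w)        # minimal-gradient stepsize
        y = x - alpha * g                # auxiliary point y_k
        r = g - alpha * w                # gradient at y_k (exact for quadratics)
        d = g_prev - r
        beta = (g_prev @ d) / (d @ d)    # minimizes ||g_{k-1} + beta (r_k - g_{k-1})||
        x, x_prev = x_prev + beta * (y - x_prev), x
        g, g_prev = g_prev + beta * (r - g_prev), g
        norms.append(np.linalg.norm(g))
    return x, norms

# Hypothetical test problem: a 50 x 50 Hessian with only p = 5 distinct
# eigenvalues. Per the finite-termination result stated in the abstract,
# the gradient norm should drop to roughly machine precision by iteration p.
rng = np.random.default_rng(0)
n, p = 50, 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal basis
eigs = np.repeat(np.array([1.0, 3.0, 7.0, 20.0, 55.0]), n // p)
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(n)

x_star, norms = dwgm(A, b, np.zeros(n))
for k, nk in enumerate(norms):
    print(f"iter {k}: ||g|| = {nk:.3e}")
```

Note that with the initialization $x_{-1} = x_0$, $g_{-1} = g_0$, the first combination coefficient works out to $\beta_0 = 1$, so the first iteration reduces to a plain minimal-gradient step, consistent with the delayed scheme only taking effect from the second iteration on.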