Abstract

Vector extrapolations for fixed-point iterations are shown to converge faster when their step lengths are computed alternately from two or three consecutive applications of the map. Based on this finding, cyclic extrapolation methods are proposed that require few objective function evaluations, no matrix inversion, and little extra memory. They are efficient in high-dimensional settings and require no problem-specific adaptation. A convergence analysis is provided for symmetric positive definite linear systems and for contraction mappings. In experiments, the proposed methods rivaled common quasi-Newton alternatives in eight mapping applications, including gradient descent for constrained and unconstrained optimization.
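To make the idea concrete, the following is a minimal illustrative sketch, not the paper's exact method: a squared-extrapolation acceleration (in the spirit of SQUAREM-type schemes) where the step length is computed from two consecutive applications of the map F. The function `accelerated_fixed_point` and the example map are hypothetical names introduced here for illustration.

```python
import math

def accelerated_fixed_point(F, x0, tol=1e-10, max_iter=200):
    """Accelerate the fixed-point iteration x <- F(x) with a
    squared-extrapolation step whose length alpha is computed
    from two consecutive map applications (illustrative sketch)."""
    x = list(x0)
    for _ in range(max_iter):
        x1 = F(x)   # first application of the map
        x2 = F(x1)  # second application of the map
        r = [a - b for a, b in zip(x1, x)]                # residual F(x) - x
        v = [c - 2 * a + b for c, a, b in zip(x2, x1, x)]  # second difference
        nv = math.sqrt(sum(t * t for t in v))
        if nv < tol:  # iteration has (numerically) converged
            return x2
        # step length from the ratio of residual norms
        alpha = -math.sqrt(sum(t * t for t in r)) / nv
        # extrapolated update: x - 2*alpha*r + alpha^2 * v
        x = [a - 2 * alpha * ri + alpha * alpha * vi
             for a, ri, vi in zip(x, r, v)]
    return x

# Example: a linear contraction F(x) = 0.5*x + b, whose fixed point is 2*b.
b = [1.0, -2.0, 0.5]
F = lambda x: [0.5 * xi + bi for xi, bi in zip(x, b)]
sol = accelerated_fixed_point(F, [0.0, 0.0, 0.0])
```

For this linear contraction the extrapolated step recovers the fixed point `[2.0, -4.0, 1.0]` far faster than the plain iteration, which only halves the error each step.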

