Abstract

We describe convergence acceleration schemes for multistep optimization algorithms where the underlying fixed-point operator is not symmetric. In particular, our analysis handles algorithms with momentum terms such as Nesterov’s accelerated method or primal-dual methods. The acceleration technique combines previous iterates through a weighted sum, whose coefficients are computed via a simple linear system. We analyze performance in both online and offline modes, and we study in particular a variant of Nesterov’s method that uses nonlinear acceleration at each iteration. We use Crouzeix’s conjecture to show that acceleration performance is controlled by the solution of a Chebyshev problem on the numerical range of a non-symmetric operator modeling the behavior of iterates near the optimum. We detail numerical experiments on logistic regression problems.
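As a concrete illustration of the weighted-sum extrapolation described above, the sketch below implements a regularized Anderson-type scheme in Python: coefficients are obtained from a small linear system built on successive iterate differences, then applied as a weighted sum of past iterates. This is a minimal sketch under stated assumptions, not the paper's exact algorithm; the function name `extrapolate`, the regularization parameter `lam`, and the gradient-descent demo are all illustrative choices.

```python
import numpy as np

def extrapolate(iterates, lam=1e-8):
    """Combine past iterates x_0, ..., x_k into a weighted sum whose
    coefficients c solve a small regularized linear system:
    (R^T R + lam*I) z = 1, then c = z / sum(z), where the columns of R
    are the successive differences x_{i+1} - x_i."""
    X = np.stack(iterates, axis=1)        # d x (k+1) matrix of iterates
    R = X[:, 1:] - X[:, :-1]              # d x k matrix of iterate differences
    k = R.shape[1]
    z = np.linalg.solve(R.T @ R + lam * np.eye(k), np.ones(k))
    c = z / z.sum()                       # coefficients normalized to sum to one
    return X[:, 1:] @ c                   # weighted sum of the latest k iterates

# Demo: accelerate plain gradient descent on a least-squares problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L for the gradient of 0.5*||Ax - b||^2

x = np.zeros(20)
iterates = [x]
for _ in range(10):
    x = x - step * A.T @ (A @ x - b)
    iterates.append(x)

x_acc = extrapolate(iterates)
print(np.linalg.norm(A.T @ (A @ x - b)))      # gradient norm at the last plain iterate
print(np.linalg.norm(A.T @ (A @ x_acc - b)))  # typically much smaller after extrapolation
```

The regularization term `lam * np.eye(k)` keeps the system well conditioned when the difference matrix is nearly rank-deficient, which is common once the iterates start to converge; in practice this parameter would be tuned or scaled relative to the norm of `R.T @ R`.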
