Abstract

Several recent works address the impact of inexact oracles on the convergence analysis of modern first-order optimization techniques, e.g. Bregman Proximal Gradient and Prox-Linear methods as well as their accelerated variants, extending their field of applicability. In this paper, we consider situations where the oracle's inexactness can be chosen on demand, with greater precision coming at a higher computational cost. Our main motivations arise from oracles that require solving auxiliary subproblems or inexactly computing the quantities involved, e.g. using a mini-batch stochastic gradient as a full-gradient estimate. We propose optimal inexactness schedules according to presumed oracle cost models and patterns of worst-case guarantees, covering, among others, convergence results of the aforementioned methods in the presence of inexactness. Specifically, we detail how to choose the level of inexactness at each iteration to obtain the best trade-off between convergence and computational effort. Furthermore, we highlight the benefits one can expect from tuning the oracles' accuracy rather than keeping it constant throughout. Finally, we provide extensive numerical experiments, in both offline and online settings, that support the practical interest of our approach when applied to the Fast Gradient algorithm.
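To make the idea concrete, the following is a minimal illustrative sketch (not the paper's actual scheme) of a Fast Gradient method on a least-squares problem in which the gradient oracle is a mini-batch estimate, and the per-iteration accuracy is governed by a user-chosen schedule; the problem data, step sizes, and the two example schedules are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
n_samples, dim = 2000, 20
A = rng.standard_normal((n_samples, dim))
b = rng.standard_normal(n_samples)
# Lipschitz constant of the full gradient of f(x) = 0.5 * ||Ax - b||^2 / n
L = np.linalg.norm(A, 2) ** 2 / n_samples

def minibatch_grad(x, batch_size):
    """Inexact oracle: gradient of f estimated on a random mini-batch."""
    idx = rng.choice(n_samples, size=batch_size, replace=False)
    Ab = A[idx]
    return Ab.T @ (Ab @ x - b) / batch_size

def fast_gradient_inexact(schedule, n_iter=200):
    """Accelerated gradient method whose oracle accuracy follows `schedule`."""
    x = y = np.zeros(dim)
    t = 1.0
    for k in range(n_iter):
        g = minibatch_grad(y, schedule(k))   # accuracy chosen on demand
        x_next = y - g / L
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        y = x_next + (t - 1.0) / t_next * (x_next - x)
        x, t = x_next, t_next
    return x

# Example accuracy schedules (hypothetical): constant batch size versus a
# batch size that grows with the iteration counter, i.e. increasing precision.
constant = lambda k: 64
increasing = lambda k: min(n_samples, 16 * (k + 1))

for name, sched in [("constant", constant), ("increasing", increasing)]:
    x_hat = fast_gradient_inexact(sched)
    obj = 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 / n_samples
    print(f"{name:>10s} schedule: objective = {obj:.4f}")

In this toy setting, the batch size plays the role of the oracle's accuracy knob: larger batches cost more per iteration but yield a less noisy gradient, which is the trade-off the inexactness schedules discussed above are designed to balance.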
