Abstract

We present a framework for generalizing the primal-dual gradient method, also known as the gradient descent ascent method, for solving convex-concave minimax problems. The framework is based on the observation that the primal-dual gradient method can be viewed as an inexact gradient method applied to the primal problem. Unlike the setting of traditional inexact gradient methods, the inexact gradient is computed by a dynamic inexact oracle, which is a discrete-time dynamical system whose output asymptotically approaches the exact gradient. For minimax problems, dynamic inexact oracles can model a range of first-order methods for computing the gradient of the primal objective, which requires solving the inner maximization problem. We provide a unified convergence analysis of gradient methods with dynamic inexact oracles and demonstrate its use in creating new accelerated primal-dual algorithms.

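As a minimal sketch of this viewpoint (not the paper's algorithm or analysis), the Python snippet below runs simultaneous gradient descent ascent on a toy strongly-convex-strongly-concave saddle function f(x, y) = 0.5*||x||^2 + x'Ay - 0.5*||y||^2. The matrix A, the step sizes alpha and beta, and the iteration count are illustrative assumptions. The y-recursion plays the role of a dynamic inexact oracle: it keeps an internal state y_k and returns the partial gradient of f in x at (x, y_k), which approaches the exact primal gradient as y_k approaches the inner maximizer y*(x).

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's formulation): gradient
# descent ascent on the toy saddle function
#   f(x, y) = 0.5*||x||^2 + x.T @ A @ y - 0.5*||y||^2,
# viewed as an inexact gradient method on the primal P(x) = max_y f(x, y).
# The y-recursion acts as the dynamic inexact oracle: its output
# grad_x f(x, y_k) approaches the exact gradient of P as y_k -> y*(x).

rng = np.random.default_rng(0)
n, m = 5, 3
A = rng.standard_normal((n, m))   # assumed problem data

def grad_x(x, y):
    # partial gradient of f with respect to x
    return x + A @ y

def grad_y(x, y):
    # partial gradient of f with respect to y
    return A.T @ x - y

x = rng.standard_normal(n)
y = np.zeros(m)                   # internal state of the dynamic oracle
alpha = beta = 0.05               # illustrative step sizes, not tuned

for k in range(3000):
    g = grad_x(x, y)              # inexact gradient of P(x) from the oracle
    y = y + beta * grad_y(x, y)   # oracle dynamics: drive y toward argmax_y f(x, y)
    x = x - alpha * g             # outer (primal) gradient step

# For this f, y*(x) = A.T @ x, so the exact primal gradient is
# (I + A @ A.T) @ x and the saddle point is x = 0, y = 0.
print("||x|| at termination:", np.linalg.norm(x))
```

Here the x-update is an ordinary gradient step on P, except that the oracle substitutes its current state y_k for the exact inner maximizer; the oracle's own recursion is what makes the supplied gradient asymptotically exact.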