Abstract

We present a framework for generalizing the primal-dual gradient method, also known as the gradient descent ascent method, for solving convex-concave minimax problems. The framework is based on the observation that the primal-dual gradient method can be viewed as an inexact gradient method applied to the primal problem. Unlike in the setting of traditional inexact gradient methods, here the inexact gradient is computed by a dynamic inexact oracle, a discrete-time dynamical system whose output asymptotically approaches the exact gradient. For minimax problems, dynamic inexact oracles can model a range of first-order methods for computing the gradient of the primal objective, a computation that relies on solving the inner maximization problem. We provide a unified convergence analysis of gradient methods with dynamic inexact oracles and demonstrate its use in creating new accelerated primal-dual algorithms.
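As a minimal illustration of the idea described above (this sketch is not taken from the paper), the Python snippet below runs plain gradient descent ascent on a hypothetical convex-concave quadratic saddle problem and interprets the ascent step on the dual variable as a dynamic inexact oracle: the partial gradient evaluated at the current dual iterate approaches the exact gradient of the primal objective. The problem data, step sizes, and iteration count are assumed values chosen only for the example.

```python
import numpy as np

# Hypothetical example:  min_x max_y  f(x, y) = 0.5*x'Px + x'Ay - 0.5*y'Qy.
# The primal objective is g(x) = max_y f(x, y), whose exact gradient is
# grad g(x) = P x + A y*(x) with y*(x) = Q^{-1} A' x.
# GDA never solves the inner maximization; instead, the ascent recursion on y
# acts as a dynamic inexact oracle whose output grad_x f(x_k, y_k)
# asymptotically approaches the exact primal gradient.

rng = np.random.default_rng(0)
n = 5
P = np.eye(n)                # strongly convex in x (assumed data)
Q = 2.0 * np.eye(n)          # strongly concave in y (assumed data)
A = rng.standard_normal((n, n))

def grad_x(x, y):
    return P @ x + A @ y     # partial gradient in x

def grad_y(x, y):
    return A.T @ x - Q @ y   # partial gradient in y

x = rng.standard_normal(n)
y = np.zeros(n)
alpha, beta = 0.05, 0.2      # primal and dual step sizes (assumed values)

for k in range(2000):
    # Dynamic inexact oracle: one ascent step on y refines the gradient estimate.
    y = y + beta * grad_y(x, y)
    # Primal descent step using the inexact gradient grad_x f(x_k, y_k).
    x = x - alpha * grad_x(x, y)

# Compare the oracle output with the exact gradient of the primal objective.
y_star = np.linalg.solve(Q, A.T @ x)   # exact inner maximizer at the current x
exact_grad = P @ x + A @ y_star
print("oracle error:", np.linalg.norm(grad_x(x, y) - exact_grad))
print("distance of x from the saddle point:", np.linalg.norm(x))
```

Other inner dynamics (for example, several ascent steps or an accelerated dual update per primal step) fit the same template by replacing the single `y` update, which is the kind of generalization the framework is meant to capture.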
