Abstract

We examine a class of stochastic mirror descent dynamics in the context of monotone variational inequalities (including Nash equilibrium and saddle-point problems). The dynamics under study are formulated as a stochastic differential equation, driven by a (single-valued) monotone operator and perturbed by a Brownian motion. The system’s controllable parameters are two variable weight sequences that, respectively, pre- and post-multiply the driver of the process. By carefully tuning these parameters, we obtain global convergence in the ergodic sense, and we estimate the average rate of convergence of the process. We also establish a large deviations principle, showing that individual trajectories exhibit exponential concentration around this average.
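To give a rough feel for the kind of dynamics involved, the sketch below discretizes a noisy monotone flow with an Euler–Maruyama scheme and tracks a weighted ergodic average of the trajectory. The operator `A`, the weight schedule `eta`, and all numerical constants are illustrative choices, not the paper's exact (SMD) system.

```python
import numpy as np

rng = np.random.default_rng(0)

# A monotone (but non-symmetric) linear operator A(x) = M x:
# M = skew-symmetric part + eps * I, so <A(x) - A(y), x - y> >= eps * |x - y|^2,
# with unique zero x* = 0 (an illustrative stand-in for the paper's driver).
eps = 0.5
S = np.array([[0.0, 1.0], [-1.0, 0.0]])  # skew part: a pure rotation field
M = S + eps * np.eye(2)
A = lambda x: M @ x

dt, sigma, T = 1e-3, 0.5, 50.0
n = int(T / dt)
x = np.array([2.0, -1.0])                # initial condition, norm ~ 2.24
w_sum, x_bar = 0.0, np.zeros(2)

for k in range(n):
    t = k * dt
    eta = 1.0 / np.sqrt(1.0 + t)         # vanishing weight (illustrative schedule)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(2)
    x = x - eta * (A(x) * dt + noise)    # Euler-Maruyama step of the noisy flow
    # weighted (ergodic) average of the trajectory
    w_sum += eta * dt
    x_bar += eta * dt * x
x_bar /= w_sum

print(np.linalg.norm(x))      # individual iterate: still noisy
print(np.linalg.norm(x_bar))  # ergodic average: concentrates near x* = 0
```

The individual iterate keeps fluctuating at the noise scale, while the weighted time average settles near the zero of the operator, which is the sense in which convergence is "ergodic" here.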

Highlights

  • Dynamical systems governed by monotone operators play an important role in optimization, game theory (Nash equilibrium and generalized Nash equilibrium problems), fixed point theory, partial differential equations, and many other areas of applied mathematics

  • A classical example of this arises in the study of gradient descent dynamics and its connection with Cauchy’s steepest descent algorithm—or, more generally, in the relation between the mirror descent (MD) class of algorithms [3] and dynamical systems derived from Bregman projections and Hessian Riemannian metrics [4,5,6]

  • We provide a “large deviations” bound showing that the ergodic gap process g(X(t)) is exponentially concentrated around its mean value (Theorem 4.3, which assumes (H1)–(H3) and that (SMD) is started from the initial condition (s, y) = (0, 0))
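The flavor of this concentration result can be reproduced in a toy simulation: for an Ornstein–Uhlenbeck process (a simple stand-in for a stabilized stochastic dynamic, not the paper's (SMD) system), the probability that the time average deviates from its mean by a fixed amount shrinks rapidly as the horizon grows, consistent with a large deviations bound.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, sigma = 0.01, 1.0

def ergodic_averages(T, n_paths=2000):
    """Time averages (1/T) * int_0^T X_t dt of an OU process dX = -X dt + sigma dW,
    simulated with Euler-Maruyama over n_paths independent trajectories."""
    n = int(T / dt)
    x = np.zeros(n_paths)
    acc = np.zeros(n_paths)
    for _ in range(n):
        x += -x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        acc += x * dt
    return acc / T

# Empirical probability that the time average deviates from 0 by more than 0.5:
fracs = {T: np.mean(np.abs(ergodic_averages(T)) > 0.5) for T in (5.0, 20.0, 80.0)}
for T, f in fracs.items():
    print(T, f)  # deviation probability shrinks fast as the horizon T grows
```

The deviation threshold 0.5 and the OU parameters are arbitrary; the qualitative point is the rapid decay of the tail probability in the horizon, which is what a large deviations principle quantifies.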


Summary

Introduction

Dynamical systems governed by monotone operators play an important role in optimization (convex programming), game theory (Nash equilibrium and generalized Nash equilibrium problems), fixed point theory, partial differential equations, and many other areas of applied mathematics. The study of the relationship between continuous- and discrete-time models has given rise to a vigorous body of literature, whose starting point is that an iterative algorithm can be seen as a discretization of a continuous dynamical system. A classical example of this arises in the study of (projected) gradient descent dynamics and its connection with Cauchy’s steepest descent algorithm—or, more generally, in the relation between the mirror descent (MD) class of algorithms [3] and dynamical systems derived from Bregman projections and Hessian Riemannian metrics [4,5,6].
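To make the mirror descent connection concrete, here is a minimal sketch of the entropic (negative-entropy) mirror map on the probability simplex, where the Bregman step has a closed multiplicative form. The quadratic objective and the step-size schedule are illustrative choices, not taken from the paper.

```python
import numpy as np

def mirror_step(x, grad, eta):
    """One mirror descent step on the simplex with the entropic mirror map
    h(x) = sum_i x_i log x_i: argmin_y <eta * grad, y> + D_h(y, x).
    The minimizer has the closed 'multiplicative weights' form below."""
    y = x * np.exp(-eta * grad)
    return y / y.sum()  # Bregman projection back onto the simplex

# Minimize f(x) = 0.5 * |x - c|^2 over the simplex; c lies inside the simplex,
# so the constrained minimizer is c itself and MD should recover it.
c = np.array([0.5, 0.3, 0.2])
x = np.ones(3) / 3.0                                  # uniform starting point
for k in range(1, 2001):
    x = mirror_step(x, x - c, eta=1.0 / np.sqrt(k))   # grad f(x) = x - c

print(x)  # close to c = [0.5, 0.3, 0.2]
```

Replacing the entropy by other Bregman functions changes the geometry of the step (e.g. the Euclidean choice recovers projected gradient descent), which is exactly the Hessian Riemannian viewpoint alluded to above.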

Problem Formulation and Related Literature
Contributions
Stochastic Mirror Descent Dynamics
Global Existence
Convergence Properties and Performance
The Small Noise Limit
Ergodic Convergence
Large Deviations
Conclusions
