Abstract

Classical extragradient schemes and their stochastic counterparts represent a cornerstone for resolving monotone variational inequality problems. Yet such schemes require two projections onto a convex set and two evaluations of the map per iteration, the former of which can be relatively expensive. We consider two related avenues where the per-iteration complexity is significantly reduced: (i) a stochastic projected reflected gradient method requiring a single evaluation of the map and a single projection; and (ii) a stochastic subgradient extragradient method that requires two evaluations of the map, a single projection onto the feasibility set, and a significantly cheaper projection (onto a halfspace) computable in closed form. Under a variance-reduced framework that relies on a sample average of the map with an increasing batch size, we prove almost sure convergence of the iterates to a random point in the solution set for both schemes. Additionally, non-asymptotic rate guarantees are derived for both schemes in terms of the gap function; notably, both rates match the best-known rates obtained in deterministic regimes. To address feasibility sets given by the intersection of a large number of convex constraints, we adapt both schemes to a random projection framework. We then show that the random projection analogs of both schemes also display almost sure convergence under a weak-sharpness requirement; furthermore, without imposing weak sharpness, both schemes attain the optimal rate in terms of the gap function of the projection of the averaged sequence onto the set, as well as the infeasibility of this sequence. Preliminary numerics support the theoretical findings, and both schemes outperform standard extragradient schemes in per-iteration complexity.
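
To make the per-iteration structure of the two schemes concrete, the sketch below gives a minimal NumPy implementation of both updates under a variance-reduced, increasing batch-size sampling of the map. This is a sketch under stated assumptions rather than the paper's exact algorithm: `proj_X` (the projection onto the feasibility set), `F_sample` (a stochastic oracle for the map), the polynomial batch schedule, and the fixed step length `gamma` are all illustrative choices.

```python
import numpy as np

def sample_average(F_sample, point, batch, rng):
    # Variance-reduced estimate of the map: average of `batch` oracle calls.
    return np.mean([F_sample(point, rng) for _ in range(batch)], axis=0)

def stochastic_reflected_gradient(proj_X, F_sample, x0, gamma, iters, rng):
    # Scheme (i): one sample-averaged map evaluation and one projection
    # onto X per iteration.
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(iters):
        N_k = (k + 1) ** 2            # increasing batch size (illustrative schedule)
        y = 2.0 * x - x_prev          # reflected point
        F_bar = sample_average(F_sample, y, N_k, rng)
        x_prev, x = x, proj_X(x - gamma * F_bar)
    return x

def halfspace_projection(z, a, y):
    # Closed-form projection of z onto the halfspace {w : <a, w - y> <= 0}.
    s = a @ (z - y)
    return z if s <= 0.0 else z - (s / (a @ a)) * a

def stochastic_subgradient_extragradient(proj_X, F_sample, x0, gamma, iters, rng):
    # Scheme (ii): two sample-averaged map evaluations, one projection onto X,
    # and one cheap closed-form projection onto a halfspace containing X.
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        N_k = (k + 1) ** 2
        F_x = sample_average(F_sample, x, N_k, rng)
        y = proj_X(x - gamma * F_x)   # the only projection onto X
        a = x - gamma * F_x - y       # normal defining the halfspace at y
        F_y = sample_average(F_sample, y, N_k, rng)
        x = halfspace_projection(x - gamma * F_y, a, y)
    return x
```

For instance, for a monotone affine map F(x) = Ax + b observed with additive noise, `F_sample` could return `A @ x + b + rng.normal(size=x.shape)`; the halfspace projection in scheme (ii) replaces the second (potentially expensive) projection onto X that a classical extragradient step would require.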
