Abstract

Stochastic generalizations of the extragradient method are complicated by a key challenge: the scheme requires two projections onto a convex set and two evaluations of the map at every major iteration. We consider two related avenues in which every iteration requires a single projection: (i) a projected reflected gradient (PRG) method requiring a single evaluation of the map and a single projection; and (ii) a modified backward-forward splitting (MBFS) method that requires two evaluations of the map and a single projection. We make the following contributions: (a) we prove almost sure convergence of the iterates to a random point in the solution set for the stochastic PRG scheme under a weak sharpness requirement; and (b) we prove that the mean of the gap function associated with the averaged sequence diminishes to zero at the optimal rate of O(1/√N) for both schemes, where N is the iteration index.
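
For orientation, here is a minimal NumPy sketch of the two single-projection updates described above. The sampled map `sample_F`, the projection, and the diminishing step-size rule γ₀/√k are illustrative assumptions, not the paper's exact conditions; the MBFS sketch follows a Tseng-style forward-backward-forward template consistent with the stated operation counts (two map evaluations, one projection), which may differ in detail from the paper's variant.

```python
import numpy as np

def stochastic_prg(sample_F, project, x0, gamma0, num_iters, rng):
    """Stochastic projected reflected gradient: per iteration, one sampled
    evaluation of the map (at the reflected point) and one projection."""
    x_prev = x0.copy()
    x = x0.copy()
    x_avg = x0.copy()
    for k in range(1, num_iters + 1):
        gamma = gamma0 / np.sqrt(k)      # diminishing steps (illustrative rule)
        y = 2.0 * x - x_prev             # reflected point
        x_prev, x = x, project(x - gamma * sample_F(y, rng))
        x_avg += (x - x_avg) / k         # averaged sequence used in the gap bound
    return x, x_avg

def stochastic_mbfs(sample_F, project, x0, gamma0, num_iters, rng):
    """Tseng-style forward-backward-forward sketch: per iteration, two
    sampled evaluations of the map and a single projection."""
    x = x0.copy()
    x_avg = x0.copy()
    for k in range(1, num_iters + 1):
        gamma = gamma0 / np.sqrt(k)
        g1 = sample_F(x, rng)            # first evaluation
        y = project(x - gamma * g1)      # the single projection
        g2 = sample_F(y, rng)            # second evaluation
        x = y - gamma * (g2 - g1)        # corrector step, no projection
        x_avg += (y - x_avg) / k
    return x, x_avg

# Toy monotone map F(x) = A x + b (A positive semidefinite) with additive
# noise, projected onto the unit Euclidean ball; all choices illustrative.
rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
sample_F = lambda x, rng: A @ x + b + 0.1 * rng.standard_normal(x.shape)
project = lambda z: z / max(1.0, float(np.linalg.norm(z)))
_, prg_avg = stochastic_prg(sample_F, project, np.zeros(2), 0.1, 5000, rng)
_, mbfs_avg = stochastic_mbfs(sample_F, project, np.zeros(2), 0.1, 5000, rng)
```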
