Abstract

Stochastic generalizations of the extragradient method are complicated by a key challenge: the scheme requires two projections onto a convex set and two evaluations of the map at every major iteration. We consider two related avenues where every iteration requires a single projection: (i) a projected reflected gradient (PRG) method requiring a single evaluation of the map and a single projection; and (ii) a modified backward-forward splitting (MBFS) method that requires two evaluations of the map and a single projection. We make the following contributions: (a) we prove almost sure convergence of the iterates to a random point in the solution set for the stochastic PRG scheme under a weak sharpness requirement; and (b) we prove that the mean of the gap function associated with the averaged sequence diminishes to zero at the optimal rate of O(1/√N) for both schemes, where N is the iteration index.
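To make the iteration concrete, the following is a minimal Python sketch of a stochastic PRG loop in the spirit of the scheme described above, applied to a monotone affine variational inequality VI(X, F) over a box. The test map F, the noise model, the step-size rule, and all names (F_sample, project, gamma) are illustrative assumptions, not notation or parameters taken from the paper.

```python
import numpy as np

# Sketch of a stochastic projected reflected gradient (PRG) iteration for
# VI(X, F): find x* in X with F(x*)^T (x - x*) >= 0 for all x in X.
# Assumptions (not from the paper): F(x) = A x + b with A positive definite,
# observed with additive Gaussian noise; X = [lo, hi]^n, so the single
# projection per iteration is a cheap clip; diminishing steps gamma_k.

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)        # positive definite => strongly monotone map
b = rng.standard_normal(n)
lo, hi = -1.0, 1.0

def F_sample(x):
    """One noisy evaluation F(x, xi) = A x + b + noise."""
    return A @ x + b + 0.1 * rng.standard_normal(n)

def project(x):
    """Single projection onto the box X = [lo, hi]^n."""
    return np.clip(x, lo, hi)

x_prev = project(rng.standard_normal(n))
x = project(rng.standard_normal(n))
avg, weight = np.zeros(n), 0.0

for k in range(1, 5001):
    gamma = 0.1 / np.sqrt(k)          # diminishing step, matching the O(1/sqrt(N)) regime
    y = 2.0 * x - x_prev              # reflected point: the single map evaluation uses y
    x_prev, x = x, project(x - gamma * F_sample(y))  # one projection per iteration
    avg += gamma * x                  # step-size-weighted averaging for the gap bound
    weight += gamma

x_avg = avg / weight
print("averaged iterate:", np.round(x_avg, 3))
```

Each pass through the loop performs exactly one sampled evaluation of the map (at the reflected point 2x_k − x_{k−1}) and one projection, which is the economy the abstract highlights over the extragradient method. An MBFS-style iteration would instead spend two sampled map evaluations per step while keeping the single projection; the precise form of that scheme is specified in the paper and is not reproduced here.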
