Abstract

In this paper, we propose two proximal gradient algorithms with variance reduction for stochastic mixed variational inequality problems: a proximal extragradient algorithm and a proximal forward–backward–forward algorithm. Under a monotonicity assumption on the mapping F and other moderate conditions, we establish asymptotic convergence properties and an O(1/k) convergence rate in terms of restricted gap function values for both algorithms. Furthermore, under a bounded metric subregularity condition, we derive linear convergence rates and oracle complexity bounds for the proposed algorithms when the sample size increases at a geometric rate. If the sample size instead increases at the polynomial rate ⌈(k+1)^s⌉ with s > 0, the mean-squared distance from the iterates to the solution set decays at a corresponding polynomial rate, while the iteration and oracle complexities required to obtain an ε-solution are O(1/ε^(1/s)) and O(1/ε^(1+1/s)), respectively. Finally, numerical experiments on stochastic network games and traffic assignment problems indicate that the proposed algorithms are efficient.
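As a rough illustration of the two update schemes, and not the paper's exact algorithms, the following minimal NumPy sketch assumes a Euclidean setting with g(x) = λ‖x‖_1, an unbiased sampling oracle F_sample, and the polynomial sample-size schedule N_k = ⌈(k+1)^s⌉ from the abstract; all function names and parameters here are hypothetical.

```python
import numpy as np

def prox_l1(v, alpha, lam):
    # Soft-thresholding: proximal operator of g(x) = lam * ||x||_1 with step alpha.
    return np.sign(v) * np.maximum(np.abs(v) - alpha * lam, 0.0)

def minibatch_F(F_sample, x, n, rng):
    # Mini-batch (variance-reduced) estimate of F(x) = E[F(x, xi)] from n i.i.d. samples.
    return np.mean([F_sample(x, rng) for _ in range(n)], axis=0)

def proximal_extragradient(F_sample, x0, alpha, lam, iters, s=1.0, seed=0):
    # Proximal extragradient scheme with sample size N_k = ceil((k+1)^s);
    # two prox evaluations per iteration.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for k in range(iters):
        N_k = int(np.ceil((k + 1) ** s))
        Fx = minibatch_F(F_sample, x, N_k, rng)
        y = prox_l1(x - alpha * Fx, alpha, lam)   # extrapolation step
        Fy = minibatch_F(F_sample, y, N_k, rng)
        x = prox_l1(x - alpha * Fy, alpha, lam)   # update step
    return x

def proximal_fbf(F_sample, x0, alpha, lam, iters, s=1.0, seed=0):
    # Tseng-type proximal forward-backward-forward variant: only one prox
    # evaluation per iteration; the second step is an explicit correction.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for k in range(iters):
        N_k = int(np.ceil((k + 1) ** s))
        Fx = minibatch_F(F_sample, x, N_k, rng)
        y = prox_l1(x - alpha * Fx, alpha, lam)   # forward-backward step
        Fy = minibatch_F(F_sample, y, N_k, rng)
        x = y - alpha * (Fy - Fx)                 # second forward (correction) step
    return x

# Toy usage: F(x, xi) = A x + b + xi with A positive definite, so F is monotone.
if __name__ == "__main__":
    rng0 = np.random.default_rng(1)
    M = rng0.standard_normal((5, 5))
    A = M @ M.T + np.eye(5)
    b = rng0.standard_normal(5)
    F_sample = lambda x, rng: A @ x + b + 0.1 * rng.standard_normal(5)
    x_eg = proximal_extragradient(F_sample, np.zeros(5), alpha=0.02, lam=0.1,
                                  iters=200, s=2.0)
    print(x_eg)
```

The sketch also shows the structural trade-off between the two schemes: the extragradient step applies the prox twice per iteration, whereas the forward–backward–forward step applies it once and replaces the second prox with an explicit correction.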
