Abstract

In this paper, we consider a stochastic variational inequality in which the mapping involved is the expectation of a given random function. Inspired by the work of He (Appl Math Optim 35:69–76, 1997) and the extragradient method proposed by Iusem et al. (SIAM J Optim 29:175–206, 2019), we propose an infeasible projection algorithm with a line search scheme, which can be viewed as a modification of the above-mentioned method of Iusem et al. In particular, in the correction step we replace the projection by the computation of a search direction and a stepsize, so that our method requires only one projection at each iteration, whereas the method of Iusem et al. requires two. Moreover, we use a dynamic sampling scheme with line search to cope with the absence of a Lipschitz constant, choosing the stepsize to be bounded away from zero and the direction to be a descent direction. In the stochastic approximation process, we iteratively reduce the variance of the stochastic error. Under appropriate assumptions, we derive results on convergence, convergence rate, and oracle complexity. In particular, compared with the method of Iusem et al., our method uses fewer projections and has the same iteration complexity, but a higher oracle complexity for a given tolerance in a finite-dimensional space. Finally, we report numerical experiments to demonstrate its efficiency.
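
For context, the problem setting described in the abstract can be written in the standard form below. The notation (feasible set $C$, random variable $\xi$, batch size $N_k$) is our own and is only meant to illustrate the setting; the abstract itself does not fix specific symbols, and the mini-batch estimator is shown only as one common way to realize a dynamic sampling scheme.

```latex
% Stochastic variational inequality: the mapping F is the expectation of a random function f.
\[
  \text{find } x^{*} \in C \quad \text{such that} \quad
  \langle F(x^{*}),\, x - x^{*} \rangle \ge 0 \;\; \forall\, x \in C,
  \qquad F(x) := \mathbb{E}_{\xi}\bigl[ f(x,\xi) \bigr].
\]
% A dynamic-sampling (mini-batch) estimate of F at iteration k; its variance
% shrinks as the sample size N_k increases along the iterations.
\[
  \widehat{F}_{k}(x) \;=\; \frac{1}{N_k} \sum_{j=1}^{N_k} f\bigl(x, \xi_{j}^{k}\bigr),
  \qquad \xi_{1}^{k},\dots,\xi_{N_k}^{k} \ \text{i.i.d. samples of } \xi .
\]
```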
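To make the idea of "one projection plus a correction via a search direction and stepsize" concrete, here is a minimal sketch of a generic projection-and-contraction iteration in the spirit of He (1997), combined with a frozen mini-batch estimate of the operator. This is an illustrative sketch under our own assumptions (hypothetical names `f`, `sample_xi`, `project_C`, and parameters `nu`, `tau`, `gamma`, `N_k`), not the algorithm analyzed in the paper.

```python
import numpy as np

def minibatch_operator(f, sample_xi, N_k, rng):
    """Return a frozen mini-batch estimate of F(x) = E[f(x, xi)] built from
    N_k i.i.d. draws; the same batch is reused at every point within one iteration."""
    batch = [sample_xi(rng) for _ in range(N_k)]
    return lambda x: np.mean([f(x, xi) for xi in batch], axis=0)

def pc_step(x, F_hat, project_C, beta=1.0, nu=0.9, tau=0.5, gamma=1.8, max_backtracks=50):
    """One generic projection-and-contraction iteration (He-style sketch):
    a single projection, followed by a correction along a computed direction."""
    Fx = F_hat(x)
    # Armijo-type backtracking on beta: shrink beta until
    #   beta * ||F_hat(y) - F_hat(x)|| <= nu * ||x - y||,  y = P_C(x - beta * F_hat(x)),
    # which compensates for not knowing a Lipschitz constant.
    for _ in range(max_backtracks):
        y = project_C(x - beta * Fx)   # the only projection in this iteration
        Fy = F_hat(y)
        r = x - y                      # natural residual
        if beta * np.linalg.norm(Fy - Fx) <= nu * np.linalg.norm(r):
            break
        beta *= tau
    # Correction step: a search direction and stepsize replace a second projection,
    # so the next iterate need not lie in C (hence an "infeasible" iterate).
    d = r - beta * (Fx - Fy)
    denom = float(np.dot(d, d))
    if denom == 0.0:                   # y already solves the sampled problem
        return y, beta
    alpha = float(np.dot(r, d)) / denom
    return x - gamma * alpha * d, beta
```

A single outer loop would draw a batch of size `N_k` (typically increasing with `k` to reduce the variance of the stochastic error), freeze `F_hat = minibatch_operator(f, sample_xi, N_k, rng)`, and then call `pc_step(x, F_hat, project_C)`.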
