Abstract

In 2012, Censor et al. (Extensions of Korpelevich's extragradient method for the variational inequality problem in Euclidean space. Optimization 61(9):1119–1132, 2012b) proposed the two-subgradient extragradient method (TSEGM). This method does not require computing the projection onto the feasible (closed and convex) set; instead, both projections are made onto half-spaces. However, the convergence of the TSEGM remained unresolved and was posed as an open question. Very recently, some authors provided a partial answer to this open question by establishing a weak convergence result for the TSEGM, though under rather stringent conditions. In this paper, we propose and study an inertial two-subgradient extragradient method (ITSEGM) for solving monotone variational inequality problems (VIPs). Under conditions more relaxed than those imposed in the existing literature, we prove that the proposed method converges strongly to a minimum-norm solution of monotone VIPs in Hilbert spaces. Unlike several existing methods for solving VIPs, our method does not require any linesearch technique, which can be time-consuming to implement. Instead, we employ a simple but efficient self-adaptive step size rule that generates a non-monotonic sequence of step sizes. Moreover, we present several numerical experiments to demonstrate the efficiency of the proposed method in comparison with related results in the literature. Finally, we apply our result to an image restoration problem. Our results improve and generalize several existing results in the literature in this direction.
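To illustrate the structure described above, the following is a minimal Python sketch of one inertial two-subgradient extragradient-type iteration for a VIP over C = {x : c(x) <= 0}, with both projections taken onto a half-space built from a subgradient of c rather than onto C itself. It is not the paper's exact scheme: the helper names (`project_halfspace`, `itsegm_step`), the inertial weight `theta`, the Halpern-type anchoring coefficient `alpha`, and the step-size update shown here are illustrative assumptions, since the abstract does not specify the precise update rules or parameter conditions.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Projection of x onto the half-space {w : <a, w> <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - (viol / (a @ a)) * a

def itsegm_step(x_prev, x_curr, F, c, subgrad_c, lam,
                theta=0.3, alpha=0.01, mu=0.5):
    """One illustrative iteration (not the paper's exact scheme)."""
    # Inertial extrapolation using the two most recent iterates.
    w = x_curr + theta * (x_curr - x_prev)

    # Half-space T containing C, built at w from a subgradient xi of c:
    # T = {v : c(w) + <xi, v - w> <= 0}, i.e. <xi, v> <= <xi, w> - c(w).
    xi = subgrad_c(w)
    b = xi @ w - c(w)

    # Two projections onto the same half-space (extragradient structure);
    # no projection onto the feasible set C is computed.
    y = project_halfspace(w - lam * F(w), xi, b)
    z = project_halfspace(w - lam * F(y), xi, b)

    # Halpern-type anchoring toward the origin is one common device for
    # strong convergence to a minimum-norm solution (assumption here).
    x_next = (1.0 - alpha) * z

    # A typical linesearch-free self-adaptive step-size update; the
    # paper's non-monotonic rule may differ.
    denom = np.linalg.norm(F(w) - F(y))
    lam_next = min(lam, mu * np.linalg.norm(w - y) / denom) if denom > 0 else lam

    return x_next, lam_next
```

In practice such a step would be repeated from two starting points x_0, x_1 and an initial step size lam_0 > 0 until a stopping criterion (e.g. a small residual ||x_curr - y||) is met; the convergence analysis in the paper dictates the admissible choices of the inertial and anchoring parameters.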