Abstract
For solving monotone inclusion problems, we propose an inertial under-relaxed version of the relative-error hybrid proximal extragradient method. We study the asymptotic convergence of the method, as well as its nonasymptotic global convergence rates in terms of iteration complexity. We analyze the new method under assumptions more flexible than existing ones, both on the extrapolation parameters and on the relative-error parameters. The approach is applied to two forward–backward-type methods for solving structured monotone inclusions.