Abstract

We develop a general approach to the convergence analysis of feasible descent methods in the presence of perturbations. The important novel feature of our analysis is that the perturbations need not tend to zero in the limit; in that case, standard convergence analysis techniques are not applicable, and a new approach is needed. We show that, in the presence of perturbations, a certain ε-approximate solution can be obtained, where ε depends linearly on the level of perturbations. Applications to the gradient projection, proximal minimization, extragradient, and incremental gradient algorithms are described.

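To make the setting concrete, the following is a minimal numerical sketch, not taken from the paper, of a gradient projection method whose gradients are corrupted by a bounded, non-vanishing perturbation. The box-constrained quadratic objective, the step size, and the perturbation level `delta` below are hypothetical choices used only to illustrate the claim that the iterates settle at an ε-approximate solution with ε roughly proportional to the level of perturbations.

```python
# Illustrative sketch (not the paper's algorithm or analysis): gradient projection
# where each gradient is corrupted by noise of norm delta that never vanishes.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test problem: min 0.5*||x - c||^2 subject to 0 <= x <= 1.
c = np.array([0.3, 1.7, -0.4])
x_star = np.clip(c, 0.0, 1.0)  # exact solution of the constrained problem


def project(x):
    """Euclidean projection onto the box [0, 1]^n."""
    return np.clip(x, 0.0, 1.0)


def perturbed_gradient_projection(x0, step=0.5, delta=0.05, iters=200):
    """Feasible descent steps with a perturbed gradient of fixed noise level delta."""
    x = x0.copy()
    for _ in range(iters):
        grad = x - c  # exact gradient of the objective
        noise = rng.normal(size=x.shape)
        noise *= delta / max(np.linalg.norm(noise), 1e-12)  # perturbation of norm delta
        x = project(x - step * (grad + noise))  # projected (feasible) descent step
    return x


for delta in (0.0, 0.05, 0.2):
    x_final = perturbed_gradient_projection(np.zeros(3), delta=delta)
    print(f"delta = {delta:4.2f}   distance to solution = {np.linalg.norm(x_final - x_star):.4f}")
```

Running the script prints the final distance to the exact constrained minimizer for several perturbation levels: with delta = 0 the iterates reach the solution, while for positive delta the residual distance stays bounded and grows roughly in proportion to delta, in line with the statement above.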