Abstract

In this paper, we establish a strong convergence theorem for a regularized variant of the projected subgradient method for nonsmooth, non-strictly convex minimization in real Hilbert spaces. Only one projection step is required per iteration, and the stepsizes are controlled so that the algorithm is of practical interest. To this end, we develop new techniques of analysis that can be adapted to many other non-Fejérian methods.
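
The abstract does not spell out the iteration; purely for illustration, a generic regularized projected subgradient scheme of the form x_{k+1} = P_C(x_k - t_k (g_k + eps_k x_k)), with g_k a subgradient, t_k a diminishing stepsize, and eps_k a vanishing regularization parameter, can be sketched as below. The objective (the l1-norm), the constraint set (a Euclidean ball), and the parameter schedules are assumptions made for the example, not the paper's setting.

```python
import numpy as np

# Illustrative sketch of a regularized projected subgradient iteration:
#   x_{k+1} = P_C( x_k - t_k * (g_k + eps_k * x_k) )
# f(x) = ||x||_1 (nonsmooth, convex, not strictly convex) and
# C = {x : ||x|| <= 1} are hypothetical choices for demonstration only.

def subgradient_l1(x):
    """A subgradient of the l1-norm (sign vector; 0 at zero entries)."""
    return np.sign(x)

def project_ball(x, radius=1.0):
    """Projection onto the closed Euclidean ball of the given radius."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

def regularized_projected_subgradient(x0, n_iter=500):
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        t_k = 1.0 / k              # diminishing stepsize (assumed schedule)
        eps_k = 1.0 / np.sqrt(k)   # vanishing regularization (assumed schedule)
        g_k = subgradient_l1(x)
        # single projection step per iteration
        x = project_ball(x - t_k * (g_k + eps_k * x))
    return x

if __name__ == "__main__":
    x_star = regularized_projected_subgradient(np.array([0.9, -0.7, 0.4]))
    print("approximate minimizer:", x_star)
```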
