Abstract

It is well known that primal first-order algorithms achieve sublinear (linear) convergence for smooth convex (smooth strongly convex) constrained minimization. However, these methods run into numerical difficulties when the primal feasible set is complicated, since they require an exact projection onto this set. Dual first-order methods are an algorithmic alternative for convex problems with a complicated feasible set: they handle complicated constraints easily, but they converge slowly when the norm of the optimal Lagrange multiplier is large, since this norm enters linearly in their convergence estimates. Moreover, they typically attain only a sublinear convergence rate in an averaged primal sequence, even when the primal problem has a smooth and strongly convex objective function. Motivated by these issues, in this paper we analyze the convergence of primal first-order methods with inexact projections for solving constrained convex problems with a smooth, and then strongly convex, objective function. In particular, we consider inexact variants of the Projected Gradient and Projected Fast Gradient methods in which, instead of an exact projection onto the complicated primal feasible set, an approximate projection, not necessarily feasible, is used. We prove that these inexact projection first-order algorithms still achieve convergence rates similar to those obtained in the exact projection setting, provided that the approximate projection is sufficiently accurate. Our convergence analysis allows us to derive explicitly the required accuracy of the inexact projection and the number of iterations needed to obtain an approximate solution of the convex problem. Finally, numerical experiments on random quadratic problems show encouraging results.
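As an informal illustration of the setting described above (not taken from the paper), the following minimal Python sketch shows an inexact projected gradient iteration in which the exact projection is replaced by an approximate, possibly infeasible, one. The function names, the quadratic test objective, and the parameters are hypothetical and chosen only for illustration.

```python
import numpy as np

def inexact_projected_gradient(grad, approx_project, x0, L, eps_proj, n_iters):
    """Sketch of an inexact projected gradient method: each iterate is an
    approximate (possibly infeasible) projection of the gradient step."""
    x = x0.copy()
    for _ in range(n_iters):
        y = x - (1.0 / L) * grad(x)           # gradient step with step size 1/L
        x = approx_project(y, eps_proj)       # approximate projection, accuracy eps_proj
    return x

# Hypothetical example: minimize 0.5*||x - b||^2 over the unit Euclidean ball,
# with the exact projection perturbed to mimic inexactness.
b = np.array([2.0, -1.0, 0.5])
grad = lambda x: x - b                        # gradient of the smooth objective (L = 1)

def approx_project(y, eps):
    p = y / max(1.0, np.linalg.norm(y))       # exact projection onto the unit ball
    return p + eps * np.random.randn(*p.shape)  # perturbation of size eps (inexactness)

x_hat = inexact_projected_gradient(grad, approx_project, np.zeros(3),
                                   L=1.0, eps_proj=1e-6, n_iters=200)
print(x_hat)
```

The sketch only illustrates the algorithmic template; the paper's contribution is the analysis of how accurate the approximate projection must be for the exact-projection convergence rates to be preserved.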
