Abstract

Dual methods can easily handle complicated constraints in convex problems, but they typically exhibit only a slow (sublinear) convergence rate in an average primal point, even when the original problem has a smooth, strongly convex objective function. Primal projected gradient-based methods achieve linear convergence for constrained, smooth and strongly convex optimization, but they are difficult to implement, since they require exact projections onto the complicated primal feasible set. Therefore, in the present work we consider an inexact projection primal gradient algorithm for convex problems with a strongly convex objective function having Lipschitz continuous gradient. More precisely, we consider the Projected Gradient algorithm in which, instead of an exact projection onto the complicated primal feasible set, an approximate projection, which is not necessarily feasible, is computed. We show that this scheme still achieves linear convergence, provided that the approximate projection is computed with sufficient accuracy. Practical performance on quadratic programs arising from model predictive control applications shows encouraging results.
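
The following Python sketch illustrates the general idea of an inexact projected gradient scheme on a small random quadratic program. It is only an illustration of the concept described above, not the paper's algorithm or its accuracy conditions: the problem data, the POCS-based approximate projection, and all parameter names are assumptions introduced here for demonstration.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's method):
# minimize 0.5*x^T Q x + q^T x  s.t.  A x <= b, where the projection onto
# the polytope is computed only approximately and may be slightly infeasible.

rng = np.random.default_rng(0)
n, m = 20, 30
M = rng.standard_normal((n, n))
Q = M @ M.T + np.eye(n)                 # strongly convex quadratic objective
q = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = rng.random(m) + 0.5                 # origin is strictly feasible

L = np.linalg.eigvalsh(Q).max()         # Lipschitz constant of the gradient
step = 1.0 / L

def grad_f(x):
    return Q @ x + q

def approx_project(y, n_inner):
    """Approximate projection onto {x : A x <= b} via a few cyclic
    halfspace projections (POCS); the output may be slightly infeasible."""
    x = y.copy()
    for _ in range(n_inner):
        for a_i, b_i in zip(A, b):
            viol = a_i @ x - b_i
            if viol > 0.0:
                x -= (viol / (a_i @ a_i)) * a_i
    return x

def inexact_projected_gradient(x0, n_outer, n_inner):
    x = x0.copy()
    for _ in range(n_outer):
        x = approx_project(x - step * grad_f(x), n_inner)
    return x

# Higher inner accuracy makes the scheme behave like exact projected gradient.
x_ref = inexact_projected_gradient(np.zeros(n), n_outer=500, n_inner=100)
x_low = inexact_projected_gradient(np.zeros(n), n_outer=100, n_inner=3)
print("distance to high-accuracy solution:", np.linalg.norm(x_low - x_ref))
print("max constraint violation of x_low :", max(0.0, float((A @ x_low - b).max())))
```

Increasing `n_inner` tightens the projection accuracy; the abstract's message is that linear convergence of the outer loop is preserved once this accuracy is high enough.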
