This paper proposes and analyzes an inexact variant of the proximal generalized alternating direction method of multipliers (ADMM) for solving separable linearly constrained convex optimization problems. In this variant, the first subproblem is solved only approximately, subject to a relative error condition, whereas the second one is assumed to be easy to solve. In many ADMM applications one of the subproblems admits a closed-form solution; this is the case, for instance, for $\ell_1$-regularized convex composite optimization problems. The proposed method possesses iteration-complexity bounds similar to those of its exact counterpart. More specifically, it is shown that, for a given tolerance $\rho > 0$, an approximate solution of the Lagrangian system associated with the problem under consideration is obtained in at most $\mathcal{O}(1/\rho^2)$ iterations (resp. $\mathcal{O}(1/\rho)$ in the ergodic case). Numerical experiments are presented to illustrate the performance of the proposed scheme.
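For orientation, the following is a minimal sketch of a standard generalized proximal ADMM template for a separable linearly constrained problem; the notation ($f$, $g$, $A$, $B$, $b$, penalty $\beta$, relaxation $\alpha$, proximal matrix $H$, multiplier $\gamma$) is assumed here and the precise scheme and error criterion of the paper may differ. The first minimization is the one carried out only approximately, while the second is assumed to be easy (often closed form):
$$
\begin{aligned}
&\text{Problem:}\quad \min_{x,\,y}\; f(x) + g(y) \quad \text{s.t.}\quad Ax + By = b,\\[2pt]
&x_{k+1} \approx \operatorname*{arg\,min}_{x}\; f(x) + \langle \gamma_k, Ax\rangle + \tfrac{\beta}{2}\,\|Ax + By_k - b\|^2 + \tfrac{1}{2}\,\|x - x_k\|_H^2 \quad \text{(solved inexactly, up to a relative error)},\\[2pt]
&y_{k+1} = \operatorname*{arg\,min}_{y}\; g(y) + \langle \gamma_k, By\rangle + \tfrac{\beta}{2}\,\big\|\alpha(Ax_{k+1}-b) - (1-\alpha)By_k + By\big\|^2 \quad \text{(assumed easy to solve)},\\[2pt]
&\gamma_{k+1} = \gamma_k + \beta\big(\alpha(Ax_{k+1}-b) - (1-\alpha)By_k + By_{k+1}\big),
\end{aligned}
$$
where $\beta > 0$, $\alpha \in (0,2)$, and $H$ is positive semidefinite. For $\alpha = 1$ and $H = 0$ this template reduces to the classical ADMM; when $g = \lambda\|\cdot\|_1$ and $B = I$, the $y$-step is a componentwise soft-thresholding, which illustrates the "easy second subproblem" setting mentioned above.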