The proximal alternating direction method of multipliers (PADMM) is a classical primal-dual splitting method for solving separable convex optimization problems with linear equality constraints, which arise in numerous applications in, e.g., signal and image processing, machine learning, and statistics. In this paper, we propose a new variant of PADMM, called PADMC, whose proximal centers are constructed by convex combinations of the iterates. PADMC is able to take advantage of problem structure and preserves the desirable properties of the classical PADMM. We establish iterate convergence as well as [Formula: see text] ergodic and [Formula: see text] nonergodic sublinear convergence rate results measured by the function residual and feasibility violation, where [Formula: see text] denotes the iteration number. Moreover, we propose two fast variants of PADMC: one achieves a faster [Formula: see text] ergodic convergence rate when one of the component functions is strongly convex, and the other ensures a faster [Formula: see text] nonergodic convergence rate measured by constraint violation. Finally, preliminary numerical results on the LASSO and elastic-net regularization problems are presented to demonstrate the performance of the proposed methods.
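For readers unfamiliar with the splitting framework the abstract builds on, the sketch below shows a classical (non-proximal-center) ADMM applied to the LASSO problem min_x 0.5||Ax − b||² + λ||x||₁, reformulated with the constraint x − z = 0. This is an illustrative baseline only, not the paper's PADMC method; all names and parameter choices here are assumptions for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=500):
    """Classical ADMM for: min 0.5||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0.
    Illustrative baseline only -- NOT the PADMC variant proposed in the paper."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable for the constraint x - z = 0
    # The x-update solves (A^T A + rho I) x = A^T b + rho (z - u);
    # factor the matrix once since it is the same at every iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))   # x-update
        z = soft_threshold(x + u, lam / rho)                # z-update (prox of l1)
        u = u + x - z                                       # dual ascent step
    return x, z, u
```

The constraint violation ||x − z|| mentioned in the abstract is exactly the primal residual of this iteration, and the nonergodic rates discussed there quantify how fast it decays along the iterates.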