Abstract

The alternating direction method of multipliers (ADMM) is a classical and effective method for solving two-block convex optimization problems subject to linear constraints. However, its convergence may not be guaranteed in the multi-block case without additional assumptions. One remedy is the block-wise ADMM (BADMM), in which the variables are first regrouped into two groups and the augmented Lagrangian function is then minimized with respect to each block variable by the following scheme: the variables are updated in a Gauss–Seidel fashion between the two groups, while the variables within each group are updated in a Jacobi fashion. To establish its convergence, a special proximal term is added to each subproblem. In this paper, we propose a new partial PPA block-wise ADMM in which proximal terms are added only to the subproblems in the first group. At the end of each iteration, an extension step on all variables is performed with a fixed step size. Because the subproblems in the second group are unmodified, the resulting iterates may be of higher quality and the convergence potentially faster. Preliminary experimental results show that the new algorithm is empirically effective on both synthetic and real problems when compared with several very efficient ADMM-based algorithms.
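To make the update scheme concrete, the following is a minimal sketch of a block-wise ADMM iteration of the kind described above, applied to a toy separable quadratic program. The toy problem, the parameter values (beta, tau, alpha), and all variable names are illustrative assumptions, not taken from the paper; the sketch only mirrors the structure: Jacobi updates with proximal terms in the first group, a Gauss–Seidel update without a proximal term in the second group, and a final extension step with a fixed step size.

```python
# Toy problem (an assumption, not from the paper):
#   minimize (x1-a1)^2 + (x2-a2)^2 + (x3-a3)^2   subject to   x1 + x2 + x3 = c
# Group 1 = {x1, x2}: updated Jacobi-style, each with proximal term (tau/2)(xi - xi_k)^2.
# Group 2 = {x3}: updated Gauss-Seidel-style from the new x1, x2, with NO proximal term.
# An extension step with fixed step size alpha finishes each iteration.

def blockwise_admm(a1, a2, a3, c, beta=1.0, tau=2.0, alpha=0.9, iters=500):
    x1 = x2 = x3 = lam = 0.0  # primal variables and Lagrange multiplier
    for _ in range(iters):
        # Group 1 (Jacobi + proximal): both updates use the *previous* x1, x2, x3.
        x1_new = (2*a1 + lam - beta*(x2 + x3 - c) + tau*x1) / (2 + beta + tau)
        x2_new = (2*a2 + lam - beta*(x1 + x3 - c) + tau*x2) / (2 + beta + tau)
        # Group 2 (Gauss-Seidel, unmodified subproblem): uses the fresh x1_new, x2_new.
        x3_new = (2*a3 + lam - beta*(x1_new + x2_new - c)) / (2 + beta)
        # Multiplier update on the constraint residual.
        lam_new = lam - beta*(x1_new + x2_new + x3_new - c)
        # Extension step on ALL variables with a fixed step size alpha.
        x1 += alpha*(x1_new - x1)
        x2 += alpha*(x2_new - x2)
        x3 += alpha*(x3_new - x3)
        lam += alpha*(lam_new - lam)
    return x1, x2, x3

# The toy problem has the closed-form solution xi* = ai + (c - (a1+a2+a3)) / 3.
x1, x2, x3 = blockwise_admm(1.0, 2.0, 3.0, 9.0)
```

For this well-conditioned example the iteration converges to the closed-form solution (2, 3, 4); the paper's actual scheme, parameter conditions, and convergence analysis are of course more general than this scalar illustration.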
