Abstract

We consider the linearly constrained separable convex minimization model whose objective function is the sum of three convex functions without coupled variables. The generalized alternating direction method of multipliers (ADMM) is a very effective approach for solving this kind of problem. Recently, the ADMM literature has focused on the case of three or more blocks. Using an error bound analysis, [14] established global linear convergence of the generalized ADMM when the number of blocks exceeds two. In contrast, in this paper we make different assumptions and prove linear convergence of the generalized ADMM with another approach. We show global convergence of the generalized ADMM when only one of the functions is assumed to be strongly convex. Moreover, global linear convergence is guaranteed when two of the three separable convex functions are strongly convex and one of them has a Lipschitz continuous gradient, together with certain rank assumptions on the linear constraint matrices.
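For orientation, the following is a minimal sketch of the three-block separable model and the directly extended (Gauss-Seidel) ADMM iteration that this line of analysis concerns; the notation (\theta_i, A_i, b, \beta, \lambda) is assumed for illustration, and the generalized ADMM studied in the paper may add relaxation or proximal terms not shown here.

% Three-block separable model (illustrative notation, not taken verbatim from the paper)
\min_{x_1,x_2,x_3}\; \theta_1(x_1)+\theta_2(x_2)+\theta_3(x_3)
\quad\text{s.t.}\quad A_1x_1+A_2x_2+A_3x_3=b,

% Augmented Lagrangian with penalty parameter \beta > 0
\mathcal{L}_\beta(x_1,x_2,x_3,\lambda)
=\sum_{i=1}^{3}\theta_i(x_i)
-\lambda^{\top}\Big(\sum_{i=1}^{3}A_ix_i-b\Big)
+\frac{\beta}{2}\Big\|\sum_{i=1}^{3}A_ix_i-b\Big\|^2,

% One iteration of the directly extended three-block ADMM:
% sequential block minimization followed by a multiplier update
x_1^{k+1}=\arg\min_{x_1}\ \mathcal{L}_\beta(x_1,x_2^{k},x_3^{k},\lambda^{k}),
\qquad
x_2^{k+1}=\arg\min_{x_2}\ \mathcal{L}_\beta(x_1^{k+1},x_2,x_3^{k},\lambda^{k}),
\qquad
x_3^{k+1}=\arg\min_{x_3}\ \mathcal{L}_\beta(x_1^{k+1},x_2^{k+1},x_3,\lambda^{k}),
\qquad
\lambda^{k+1}=\lambda^{k}-\beta\Big(\sum_{i=1}^{3}A_ix_i^{k+1}-b\Big).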
