Abstract

In this paper, we show that the direct extension of the alternating direction method of multipliers (ADMM) to 3-block separable convex minimization problems is convergent if one block of the objective possesses sub-strong monotonicity, a property weaker than strong convexity. In particular, under some additional conditions we estimate the global linear convergence rate of the direct extension of ADMM, measured by iteration complexity.
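For orientation, the following is a minimal sketch of the standard strong-convexity inequality that sub-strong monotonicity relaxes; the paper's precise definition of sub-strong monotonicity is given in the full text. A convex function $\theta$ is strongly convex with modulus $\mu > 0$ exactly when its subdifferential $\partial\theta$ is strongly monotone:

$$\langle u - v,\; x - y \rangle \;\ge\; \mu\,\|x - y\|^{2} \qquad \text{for all } u \in \partial\theta(x),\; v \in \partial\theta(y).$$

Sub-strong monotonicity requires only a weaker inequality of this type, so it can hold for functions that are not strongly convex.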

Highlights

  • Because there is still a gap between the empirical efficiency of the direct extension of the alternating direction method of multipliers (ADMM) on a variety of applications and the lack of theoretical conditions that both ensure its convergence and are satisfied by those applications, the main attention of this paper is paid to the convergence of the direct extension of ADMM for 3-block separable convex optimization problems. We consider the following separable convex minimization problem, whose objective function is the sum of three functions without coupled variables:

    $$\min_{x_1,\,x_2,\,x_3} \; \theta_1(x_1) + \theta_2(x_2) + \theta_3(x_3) \quad \text{s.t.} \quad A_1 x_1 + A_2 x_2 + A_3 x_3 = b.$$


  • We show that the direct extension of ADMM is convergent if one function in the objective is sub-strongly monotone, together with some minor restrictions on the coefficient matrices $A_1$, $A_2$, $A_3$ and the penalty parameter $\beta$. This explains why the direct extension of ADMM works well for some applications even though no strongly convex functions appear in them.


Summary

Introduction

Because there is still a gap between the empirical efficiency of the direct extension of ADMM on a variety of applications and the lack of theoretical conditions that both ensure its convergence and are satisfied by those applications, the main attention of this paper is paid to the convergence of the direct extension of ADMM for 3-block separable convex optimization problems. The scheme can be rewritten as

$$x_1^{k+1} = \arg\min_{x_1} L_\beta(x_1,\, x_2^k,\, x_3^k,\, \lambda^k),$$

$$x_2^{k+1} = \arg\min_{x_2} L_\beta(x_1^{k+1},\, x_2,\, x_3^k,\, \lambda^k),$$

$$\lambda^{k+1} = \lambda^k - \beta\,(A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k} - b),$$

$$x_3^{k+1} = \arg\min_{x_3} L_\beta(x_1^{k+1},\, x_2^{k+1},\, x_3,\, \lambda^{k+1}),$$

where $L_\beta$ is the augmented Lagrangian

$$L_\beta(x_1, x_2, x_3, \lambda) = \sum_{i=1}^{3} \theta_i(x_i) - \Big\langle \lambda,\, \sum_{i=1}^{3} A_i x_i - b \Big\rangle + \frac{\beta}{2}\,\Big\| \sum_{i=1}^{3} A_i x_i - b \Big\|^{2}.$$

In this manuscript, we show that this scheme is convergent if one function in the objective is sub-strongly monotone, together with some minor restrictions on the coefficient matrices $A_1$, $A_2$, $A_3$ and the penalty parameter $\beta$; this explains why the direct extension of ADMM works well for some applications even though no strongly convex functions appear in them. In particular, there exists a real number $\rho$ in a suitable interval such that the matrix $G$ used in the analysis is symmetric and positive definite.
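To make the update order above concrete, here is a minimal NumPy sketch of the rewritten scheme on a toy instance with quadratic blocks $\theta_i(x_i) = \tfrac{1}{2}\|x_i - c_i\|^2$; these blocks are strongly convex, so the sub-strong monotonicity assumption is comfortably satisfied. The matrices $A_i$, the vectors $c_i$ and $b$, and the choice $\beta = 1$ are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Toy 3-block problem (illustrative data, not from the paper):
#   min  0.5*||x1 - c1||^2 + 0.5*||x2 - c2||^2 + 0.5*||x3 - c3||^2
#   s.t. A1 x1 + A2 x2 + A3 x3 = b
rng = np.random.default_rng(0)
m, n = 8, 5
A1, A2, A3 = (rng.standard_normal((m, n)) for _ in range(3))
c1, c2, c3 = (rng.standard_normal(n) for _ in range(3))
b = rng.standard_normal(m)
beta = 1.0  # penalty parameter (assumed value)

def block_min(A, c, lam, r):
    """Solve argmin_x 0.5*||x - c||^2 - <lam, A x> + (beta/2)*||A x + r||^2,
    where r collects the other blocks' contribution minus b.
    First-order condition: (I + beta*A^T A) x = c + A^T lam - beta*A^T r."""
    return np.linalg.solve(np.eye(A.shape[1]) + beta * A.T @ A,
                           c + A.T @ lam - beta * A.T @ r)

x1, x2, x3, lam = np.zeros(n), np.zeros(n), np.zeros(n), np.zeros(m)
for k in range(500):
    x1 = block_min(A1, c1, lam, A2 @ x2 + A3 @ x3 - b)    # x1-update
    x2 = block_min(A2, c2, lam, A1 @ x1 + A3 @ x3 - b)    # x2-update
    lam = lam - beta * (A1 @ x1 + A2 @ x2 + A3 @ x3 - b)  # multiplier update (x3 is still x3^k here)
    x3 = block_min(A3, c3, lam, A1 @ x1 + A2 @ x2 - b)    # x3-update with the new multiplier

print("constraint residual:", np.linalg.norm(A1 @ x1 + A2 @ x2 + A3 @ x3 - b))
```

When the scheme converges, the printed constraint residual shrinks toward zero; divergence on a particular instance would be consistent with the known fact that the direct extension of ADMM need not converge without extra conditions such as those studied in this paper.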

