Abstract

The alternating direction method of multipliers (ADMM) is an effective method for solving two-block separable convex problems, and its convergence is well understood. When the number of blocks exceeds two, or the problem involves a nonconvex function or a nonseparable structure, ADMM or its directly extended version may fail to converge. In this paper, we propose an ADMM-based algorithm for nonconvex multiblock optimization problems with a nonseparable structure. We show that, under mild conditions, any cluster point of the iterative sequence generated by the proposed algorithm is a critical point. Furthermore, we establish strong convergence of the whole sequence under the condition that the potential function satisfies the Kurdyka–Łojasiewicz property. This provides a theoretical basis for applying the proposed ADMM in practice. Finally, we present preliminary numerical results demonstrating the effectiveness of the proposed algorithm.

Highlights

  • The alternating direction method of multipliers (ADMM) or its directly extended version may not converge when the number of blocks exceeds two, the problem involves a nonconvex function, or the structure is nonseparable

  • We report the number of iterations ("Iter"), the computing time in seconds ("Time"), and the objective function value ("f-val")

  • We propose a new algorithm, called the linearized Bregman alternating direction method of multipliers (ADMM), for the three-block optimization problem with a nonseparable structure. The proposed algorithm integrates the basic ideas of linearization and regularization techniques


Introduction

ADMM or its directly extended version may fail to converge when the number of blocks exceeds two, the problem involves a nonconvex function, or the structure is nonseparable. Li and Pong [8] studied the convergence of ADMM for some special two-block nonconvex models in which one of the matrices A and B is an identity matrix. Wang et al. [11] studied the convergence of ADMM for nonconvex nonsmooth optimization with a nonseparable structure. Guo et al. [4, 5] studied the convergence of the classical ADMM for two-block and multiblock nonconvex models where one of the matrices is an identity matrix. Yang et al. [13] studied the convergence of ADMM for a nonconvex optimization model arising from background/foreground extraction.
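For context on the two-block scheme whose extensions are discussed above, here is a minimal sketch of the classical two-block ADMM applied to the lasso problem min 0.5||Ax − b||² + λ||z||₁ subject to x − z = 0. This is an illustrative convex instance chosen for concreteness, not a model from this paper; the function and parameter names are our own.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=500):
    """Classical two-block ADMM for min 0.5||Ax-b||^2 + lam*||z||_1 s.t. x - z = 0."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable (multiplier / rho)
    # Cache a Cholesky factorization of (A^T A + rho I) for the x-updates.
    Atb = A.T @ b
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    for _ in range(n_iter):
        # x-update: minimize 0.5||Ax-b||^2 + (rho/2)||x - z + u||^2
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step on the l1 term
        z = soft_threshold(x + u, lam / rho)
        # dual update: ascent step on the scaled multiplier
        u = u + x - z
    return x, z
```

The three updates (primal x-block, primal z-block, dual ascent) are exactly the steps that the multiblock and nonconvex variants cited above modify, e.g., by linearizing subproblems or adding regularization terms.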
