Abstract

The linearized alternating direction method of multipliers (ADMM) with indefinite proximal regularization has proven efficient for solving separable convex optimization problems subject to linear constraints. In this paper, we present a generalized linearized ADMM (G-LADMM) for the two-block separable convex minimization model, which linearizes all the subproblems by choosing a proper positive-definite or indefinite proximal term and updates the Lagrange multiplier twice in different ways. Furthermore, the proposed G-LADMM can be expressed as a proximal point algorithm (PPA), and each subproblem reduces to evaluating the proximity operator of a function in the objective. We specify the range of the proximal parameter and stepsizes that guarantees global convergence of G-LADMM; this range is significantly larger than the convergence domains established in the literature. Numerical experiments on LASSO and image decomposition problems illustrate the improvements offered by the proposed G-LADMM.
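To make the linearization idea concrete, the following is a minimal sketch of a standard linearized ADMM applied to the LASSO problem min (1/2)||Mx - b||^2 + tau*||z||_1 subject to x = z. It is not the paper's G-LADMM (no second multiplier update or relaxed stepsizes), but it shows how linearizing the smooth subproblem with a proximal term reduces both subproblems to closed-form proximity operators. All names and parameter choices here are illustrative assumptions; the conservative choice eta >= lambda_max(M^T M) corresponds to a positive-definite proximal term, while indefinite variants permit smaller eta.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Proximity operator of kappa * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def linearized_admm_lasso(M, b, tau, rho=1.0, n_iter=500):
    """Linearized ADMM sketch for min 0.5*||Mx - b||^2 + tau*||z||_1 s.t. x = z.

    The smooth x-subproblem is linearized at the current iterate and
    regularized by a proximal term (eta/2)*||x - x^k||^2, so the x-update
    has a closed form; the z-update is the l1 proximity operator.
    Here eta = lambda_max(M^T M), the positive-definite (conservative) choice.
    """
    n = M.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled Lagrange multiplier
    eta = np.linalg.norm(M, 2) ** 2      # spectral norm squared of M
    for _ in range(n_iter):
        grad = M.T @ (M @ x - b)         # gradient of the smooth term at x^k
        # Linearized x-update: minimize <grad, x> + (rho/2)||x - z + u||^2
        #                      + (eta/2)||x - x^k||^2  (closed form)
        x = (rho * (z - u) + eta * x - grad) / (rho + eta)
        z = soft_threshold(x + u, tau / rho)  # prox of tau*||.||_1
        u = u + x - z                         # multiplier update
    return z
```

The key point matching the abstract: no linear system in M needs to be solved per iteration; only a gradient evaluation and a proximity operator are required, which is what makes linearized variants attractive for large-scale LASSO instances.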
