Abstract

The alternating direction method of multipliers (ADMM) is an effective method for solving convex optimization problems arising in a wide range of fields. At each iteration, the classical ADMM solves two subproblems exactly. In many applications, however, it is expensive or impossible to obtain exact solutions of these subproblems. To overcome this difficulty, proximal terms are added to the subproblems. Methods of this class solve the original subproblems only approximately and hence typically require more iterations. This observation suggests that a carefully chosen proximal term may yield better performance than the classical ADMM. In this paper, we propose a proximal ADMM in which the regularization matrix of the proximal term is generated at every iteration by the BFGS update (or limited-memory BFGS). Such matrices exploit second-order information about the objective function. The convergence of the proposed method is proved under certain assumptions. Numerical results are presented to demonstrate the effectiveness of the proposed proximal ADMM.
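For concreteness, the following is a minimal sketch of a proximal ADMM iteration in the standard two-block setting, minimize f(x) + g(z) subject to Ax + Bz = c. The placement of the proximal term in the x-subproblem, the penalty parameter β > 0, and the symbol P_k for the BFGS-generated regularization matrix are notational assumptions following the usual proximal-ADMM template, not details taken from the paper itself.

% Assumed two-block problem: minimize f(x) + g(z) subject to Ax + Bz = c.
% P_k denotes a regularization matrix produced by a BFGS-type update (assumption);
% beta > 0 is the penalty parameter of the augmented Lagrangian.
\begin{align*}
  x^{k+1} &\in \arg\min_{x}\; \mathcal{L}_{\beta}(x, z^{k}, \lambda^{k})
             + \tfrac{1}{2}\,\|x - x^{k}\|_{P_{k}}^{2},\\
  z^{k+1} &\in \arg\min_{z}\; \mathcal{L}_{\beta}(x^{k+1}, z, \lambda^{k}),\\
  \lambda^{k+1} &= \lambda^{k} - \beta\,(A x^{k+1} + B z^{k+1} - c),
\end{align*}
where
\[
  \mathcal{L}_{\beta}(x, z, \lambda)
  = f(x) + g(z) - \lambda^{\top}(Ax + Bz - c)
    + \tfrac{\beta}{2}\,\|Ax + Bz - c\|^{2},
  \qquad \|v\|_{P}^{2} = v^{\top} P v.
\]

Setting P_k = 0 recovers the classical ADMM, while a fixed P_k = τ I gives the usual proximal ADMM; the proposal in the abstract replaces this fixed choice with a matrix updated by (limited-memory) BFGS at every iteration.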
