Abstract

This paper studies a majorized alternating direction method of multipliers with indefinite proximal terms (iPADMM) for convex composite optimization problems. We show that the majorized iPADMM for 2-block convex optimization problems converges globally under weaker conditions than those used in the literature and exhibits a linear convergence rate under a local error bound condition. Building on these results, we establish linear convergence for a symmetric Gauss-Seidel based majorized iPADMM designed for multi-block composite convex optimization problems. Moreover, we apply the majorized iPADMM to solve different types of regularized logistic regression problems. Numerical results on both synthetic and real datasets demonstrate the efficiency of the majorized iPADMM and illustrate the effectiveness of the introduced indefinite proximal terms.
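To fix ideas, the sketch below illustrates the kind of iteration the abstract refers to: a majorized proximal ADMM applied to l1-regularized logistic regression, min_x sum_i log(1 + exp(-b_i a_i^T x)) + lam ||z||_1 subject to x = z. This is a simplified, hypothetical illustration rather than the exact scheme analyzed in the paper; the scalar proximal weight t (which may be negative, mimicking an indefinite proximal term), the penalty sigma, and the function names are illustrative assumptions.

```python
# Hypothetical, simplified sketch of a majorized proximal ADMM iteration for
# l1-regularized logistic regression. The scalar weight `t` stands in for a
# (possibly indefinite) proximal term; the majorization replaces the logistic
# loss by a quadratic upper bound with Lipschitz constant L. Parameter names
# and values are illustrative, not taken from the paper.
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def logistic_grad(A, b, x):
    """Gradient of sum_i log(1 + exp(-b_i * a_i^T x)), computed stably."""
    margins = b * (A @ x)
    return -A.T @ (b * np.exp(-np.logaddexp(0.0, margins)))

def majorized_prox_admm(A, b, lam, sigma=1.0, t=-0.2, iters=500):
    m, n = A.shape
    L = 0.25 * np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the logistic loss
    assert L + sigma + t > 0, "x-subproblem must remain strongly convex"
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    for _ in range(iters):
        # x-update: minimize the quadratic majorant of the loss plus the
        # augmented Lagrangian term and the proximal term (t may be negative).
        g = logistic_grad(A, b, x)
        x = (L * x - g + sigma * (z - u) + t * x) / (L + sigma + t)
        # z-update: prox of the l1 norm (soft thresholding).
        z = soft_threshold(x + u, lam / sigma)
        # Scaled dual-variable update.
        u = u + x - z
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50); x_true[:5] = 1.0
    b = np.sign(A @ x_true + 0.1 * rng.standard_normal(200))
    x_hat = majorized_prox_admm(A, b, lam=1.0)
    print("nonzeros in estimate:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```

A slightly negative proximal weight t (kept small enough that L + sigma + t > 0) makes the x-subproblem "less damped" than the standard positive-semidefinite choice, which is the intuition behind allowing indefinite proximal terms.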
