Abstract

For a type of multi-block separable convex programming arising in machine learning and statistical inference, we propose a proximal alternating direction method of multipliers with partially parallel splitting, which has the following nice properties: (1) to alleviate the weight of the proximal terms, the restrictions imposed on the proximal parameters are substantially relaxed; (2) to maintain the inherent structure of the primal variables $$x_i\,(i=1,2,\ldots ,m)$$, the relaxation parameter $$\gamma $$ is attached only to the update formula of the dual variable $$\lambda $$. For the resulting method, we establish its global convergence and a worst-case $$\mathcal {O}(1/t)$$ convergence rate in an ergodic sense, where t is the iteration counter. Finally, three numerical examples are given to illustrate the theoretical results.
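To make the setting concrete, the following is a minimal sketch of a proximal ADMM iteration on a toy two-block problem, not the authors' partially parallel method itself. All symbols and parameter values here (the quadratic objectives, the proximal weight `tau`, the penalty `beta`) are illustrative assumptions; the only feature carried over from the abstract is that the relaxation parameter `gamma` appears only in the dual update for `lam`.

```python
# Toy problem (assumed for illustration):
#   minimize 0.5*(x1 - a)**2 + 0.5*(x2 - c)**2  subject to  x1 - x2 = 0,
# whose solution is x1 = x2 = (a + c) / 2.
a, c = 1.0, 3.0
beta = 1.0    # penalty parameter of the augmented Lagrangian (assumed value)
tau = 0.5     # proximal parameter; the paper relaxes restrictions on such weights
gamma = 1.0   # relaxation parameter, attached ONLY to the dual update below

x1, x2, lam = 0.0, 0.0, 0.0
for _ in range(500):
    # Proximal x1-update: argmin of f1 + Lagrangian + proximal term (closed form)
    x1 = (a + lam + beta * x2 + tau * x1) / (1.0 + beta + tau)
    # Proximal x2-update, using the fresh x1 (Gauss-Seidel order)
    x2 = (c - lam + beta * x1 + tau * x2) / (1.0 + beta + tau)
    # Relaxed dual update: gamma multiplies only this step
    lam = lam - gamma * beta * (x1 - x2)

print(x1, x2)  # both approach (a + c) / 2 = 2.0
```

The closed-form updates come from setting the gradient of each subproblem to zero; for general $$f_i$$ these steps would be proximal operators rather than explicit formulas.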
