Abstract
The Alternating Direction Method of Multipliers (ADMM) has attracted considerable attention for solving large-scale, objective-separable constrained optimization problems. However, the two-block variable structure of the ADMM still limits its practical computational efficiency, because at least one factorization of a large matrix is needed even for linear and convex quadratic programming. This drawback may be overcome by enforcing a multi-block structure of the decision variables in the original optimization problem. Unfortunately, the multi-block ADMM, with more than two blocks, is not guaranteed to be convergent. On the other hand, two positive developments have been made: first, if in each cyclic loop one randomly permutes the updating order of the multiple blocks, then the method converges in expectation for solving any system of linear equations with any number of blocks; second, such a randomly permuted ADMM also works for equality-constrained convex quadratic programming even when the objective function is not separable. The goal of this paper is twofold. First, we add more randomness into the ADMM by developing a randomly assembled cyclic ADMM (RAC-ADMM), in which the decision variables in each block are randomly assembled. We discuss the theoretical properties of RAC-ADMM, show when random assembling helps and when it hurts, and develop a criterion that guarantees almost-sure convergence. Second, guided by these theoretical results, we conduct numerical tests on both randomly generated and large-scale benchmark quadratic optimization problems, including continuous and binary graph-partition and quadratic assignment problems as well as selected machine learning problems. Our numerical tests show that RAC-ADMM, combined with a variable-grouping strategy, can significantly improve computational efficiency on most quadratic optimization problems.
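The abstract describes the random assembling step only verbally; the following is a minimal sketch of one RAC-ADMM cycle for a purely quadratic, equality-constrained instance (no separable terms, no bounds), written under our own conventions: penalty parameter beta, multiplier update y ← y + beta(Ax − b), and the conventional 1/2 factor on the quadratic term. The function name rac_admm_cycle and all variable names are illustrative, not the authors' implementation.

```python
# Minimal sketch of one RAC-ADMM cycle for the equality-constrained convex QP
#     min 0.5 x'Hx + c'x   s.t.  Ax = b,
# with augmented Lagrangian
#     L(x, y) = 0.5 x'Hx + c'x + y'(Ax - b) + 0.5*beta*||Ax - b||^2.
import numpy as np

def rac_admm_cycle(H, c, A, b, x, y, beta, num_blocks, rng):
    n = len(x)
    # Randomly assemble the decision variables into blocks for this cycle.
    perm = rng.permutation(n)
    blocks = np.array_split(perm, num_blocks)
    for idx in blocks:
        rest = np.setdiff1d(np.arange(n), idx)
        # Minimize L(x, y) over the block x[idx] with the other variables fixed:
        # (H[idx,idx] + beta*A_i'A_i) x[idx] = -(c[idx] + H[idx,rest] x[rest]
        #                                        + A_i'(y + beta*(A[:,rest] x[rest] - b)))
        Ai = A[:, idx]
        lhs = H[np.ix_(idx, idx)] + beta * Ai.T @ Ai
        rhs = -(c[idx] + H[np.ix_(idx, rest)] @ x[rest]
                + Ai.T @ (y + beta * (A[:, rest] @ x[rest] - b)))
        x[idx] = np.linalg.solve(lhs, rhs)
    # Dual (multiplier) update after the full primal sweep.
    y = y + beta * (A @ x - b)
    return x, y
```

In this sketch the only difference from a cyclic multi-block ADMM is the re-drawn permutation and block assembly at the start of every cycle; the per-block solves themselves are the usual quadratic minimizations.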
Highlights
In this paper we consider the linearly constrained convex minimization model with an objective function that is the sum of multiple separable functions and a coupled quadratic function: $\min_x \; f(x) = \tfrac{1}{2} x^\top H x + c^\top x + \sum_{i=1}^{p} f_i(x_i)$ s.t. $Ax = b$ (a standard augmented-Lagrangian update for this model is sketched after these highlights)
In terms of run-time on dense problems, RAC-ADMM (randomly assembled cyclic Alternating Direction Method of Multipliers) is 3 times faster than Matlab lasso and 7 times faster than glmnet
Randomly permuted multi-block ADMM (RP-ADMM) is 6 times faster than Matlab lasso, and 14 times faster than glmnet
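For reference, the highlights quote timings without restating the update rule; a standard augmented-Lagrangian form of the multi-block update for the model in the first highlight is sketched below. The penalty parameter $\beta$ and the sign convention for the multiplier update are our choices and may differ from the paper's.

\[
L_\beta(x,y) \;=\; \tfrac{1}{2}x^\top H x + c^\top x + \sum_{i=1}^{p} f_i(x_i)
              + y^\top (Ax-b) + \tfrac{\beta}{2}\,\lVert Ax-b\rVert^2 ,
\]
\[
x_i \;\leftarrow\; \operatorname*{arg\,min}_{x_i}\, L_\beta(x_1,\dots,x_p,\,y)
\quad (i = 1,\dots,p \text{ in the cyclic order of the current sweep}),
\qquad
y \;\leftarrow\; y + \beta\,(Ax-b).
\]

RP-ADMM draws a random permutation of the $p$ fixed blocks in each sweep; RAC-ADMM additionally re-assembles which variables form each block.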
Summary
The two-block variable structure of the ADMM still limits the practical computational efficiency of the method, because at least one factorization of a large matrix is needed even for linear and convex quadratic programming (e.g., [45,65]). This drawback may be overcome by enforcing a multi-block structure of the decision variables in the original optimization problem, although a cyclic multi-block ADMM with more than two blocks is not guaranteed to converge; randomly permuting the updating order of the blocks (RP-ADMM) converges in expectation for linear systems with any number of blocks. In [17] the authors focused on solving linearly constrained convex optimization with a coupled convex quadratic objective, and proved convergence in expectation of RP-ADMM for non-separable multi-block convex quadratic programming, which is a much broader class of computational problems.
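To make the factorization argument concrete, here is a rough, illustrative flop count (our own back-of-the-envelope numbers, not from the paper): a two-block method must factor one n x n system, whereas a p-block split only factors p much smaller block systems per sweep, such as the H[idx, idx] + beta*A[:, idx].T @ A[:, idx] matrices used in rac_admm_cycle above.

```python
# Illustrative only: per-sweep factorization cost, counted at the O(n^3) Cholesky level.
n, p = 10_000, 50                       # hypothetical problem size and block count
full_cost = n ** 3                      # one factorization of the full n x n system
block_cost = p * (n // p) ** 3          # p factorizations of (n/p) x (n/p) block systems
print(block_cost / full_cost)           # 1 / p**2 = 0.0004
```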