Abstract

In this article, we present a parallelized primal-dual algorithm for solving constrained convex optimization problems. The algorithm is "block-based": the vectors of primal and dual variables are partitioned into blocks, each of which is updated by only a single processor. We consider four behaviors that could be asynchronous: 1) updates to primal variables; 2) updates to dual variables; 3) communications of primal variables; and 4) communications of dual variables. We show that any amount of asynchrony in the communications of dual variables can preclude convergence, while the other three forms of asynchrony are permitted. A first-order primal-dual update law is then developed and shown to be robust to these other forms of asynchrony. We next derive convergence rates to an approximate Lagrangian saddle point in terms of the operations agents execute, without specifying any timing or pattern with which they must be executed. The distance between the approximate solution we obtain and the exact solution is explicitly bounded. Convergence rates include an "asynchrony penalty" that we quantify and present ways to mitigate. Numerical results illustrate these developments.
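
To make the block-based structure concrete, the following is a minimal sketch of a block-partitioned first-order primal-dual update (gradient descent on the primal variables, projected gradient ascent on the dual variables) applied to a toy inequality-constrained quadratic program. The problem data, step sizes, and block partition are illustrative assumptions, and the sketch runs the blocks sequentially in a single process; it is not the paper's exact update law or its asynchrony model.

```python
import numpy as np

# Toy constrained convex problem (illustrative assumption, not from the paper):
#   minimize   f(x) = 0.5 * ||x - c||^2
#   subject to A x <= b
# Lagrangian: L(x, lam) = f(x) + lam^T (A x - b), with lam >= 0.
rng = np.random.default_rng(0)
n, m = 6, 3
c = rng.normal(size=n)
A = rng.normal(size=(m, n))
b = A @ c + 1.0          # choose b so the unconstrained minimizer is nearly feasible

def grad_x(x, lam):      # gradient of the Lagrangian in the primal variables
    return (x - c) + A.T @ lam

def grad_lam(x):         # gradient of the Lagrangian in the dual variables
    return A @ x - b

# Block partition of the primal vector: each "processor" owns one block.
blocks = [np.arange(0, 3), np.arange(3, 6)]

x = np.zeros(n)
lam = np.zeros(m)
alpha, beta = 0.05, 0.05  # primal and dual step sizes (assumed values)

for k in range(2000):
    g = grad_x(x, lam)
    # Block-wise primal descent: each block is updated only by its owner.
    # Here every block updates each iteration, but a block could skip
    # iterations (asynchronous primal updates) without changing the form
    # of the update.
    for blk in blocks:
        x[blk] -= alpha * g[blk]
    # Dual ascent with projection onto the nonnegative orthant.
    lam = np.maximum(0.0, lam + beta * grad_lam(x))

print("approximate primal solution:", np.round(x, 3))
print("max constraint violation:", max(0.0, float(np.max(A @ x - b))))
```

In this sketch the dual update always uses the current primal iterate, which reflects the abstract's finding that asynchrony is tolerable in primal updates and communications but not in the communication of dual variables.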
