Abstract

In this paper, we consider a network of processors that aim to cooperatively solve a large-scale, convex optimization problem. Each processor has knowledge of a local cost function that depends only on a local variable. The goal is to minimize the sum of the local costs while making the variables satisfy both local constraints and a global coupling constraint. We propose a simple, fully distributed algorithm that works in a random, time-varying communication model, where at each iteration multiple edges are randomly drawn from an underlying graph. The algorithm is interpreted as a primal decomposition scheme applied to an equivalent problem reformulation. Almost sure convergence to the optimal cost of the original problem is proven by resorting to tools from block subgradient methods. Specifically, the communication structure is mapped to a block structure, where the blocks correspond to the graph edges and are randomly selected at each iteration. Moreover, an almost sure asymptotic primal recovery property, requiring no averaging mechanisms, is shown. A numerical example corroborates the theoretical analysis.
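The setup described above corresponds to a constraint-coupled problem that can be sketched as follows; the symbols f_i, X_i, g_i, and N are illustrative assumptions, since the abstract does not fix the notation:

\begin{align}
  \min_{x_1, \ldots, x_N} \quad & \sum_{i=1}^{N} f_i(x_i) \\
  \text{subject to} \quad & x_i \in X_i, \quad i = 1, \ldots, N, \\
  & \sum_{i=1}^{N} g_i(x_i) \le 0,
\end{align}

where each f_i is a local convex cost, each X_i a local constraint set, and the last inequality is the global coupling constraint linking the local decision variables x_i.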
