Abstract

In this paper, we consider a network of processors that want to cooperatively solve a large-scale, convex optimization problem. Each processor has knowledge of a local cost function that depends only on a local variable. The goal is to minimize the sum of the local costs while requiring the variables to satisfy both local constraints and a global coupling constraint. We propose a simple, fully distributed algorithm that works in a random, time-varying communication model, where at each iteration multiple edges are randomly drawn from an underlying graph. The algorithm is interpreted as a primal decomposition scheme applied to an equivalent problem reformulation. Almost sure convergence to the optimal cost of the original problem is proven by leveraging techniques from block subgradient methods. Specifically, the communication structure is mapped to a block structure, where the blocks correspond to the graph edges and are randomly selected at each iteration. Moreover, an almost sure asymptotic primal recovery property, with no averaging mechanisms, is shown. A numerical example corroborates the theoretical analysis.
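To illustrate the primal decomposition idea described above, the following is a minimal sketch on a toy problem: each node minimizes a scalar quadratic cost subject to a local allocation bound, and the allocations are coupled by a single resource constraint. At each iteration one edge is drawn at random and the two incident nodes exchange their local multipliers to update their allocations, preserving the coupling constraint exactly. All names, the quadratic costs, and the closed-form subproblem are illustrative assumptions, not the paper's actual algorithm or analysis.

```python
import random

def primal_decomposition(c, b, edges, steps=5000, alpha=0.05, seed=0):
    """Toy primal decomposition with random edge activation.

    Problem (illustrative): min sum_i (x_i - c_i)^2  s.t.  x_i <= y_i,
    with the coupling constraint sum_i y_i == b on the allocations y.
    """
    rng = random.Random(seed)
    n = len(c)
    y = [b / n] * n  # initial allocation; satisfies sum(y) == b
    for _ in range(steps):
        i, j = rng.choice(edges)  # one edge randomly drawn this iteration
        # Local subproblem min (x - c_i)^2 s.t. x <= y_i has closed form:
        # x_i = min(c_i, y_i), with multiplier mu_i = max(0, 2*(c_i - y_i)).
        mu_i = max(0.0, 2 * (c[i] - y[i]))
        mu_j = max(0.0, 2 * (c[j] - y[j]))
        # A subgradient of the local optimal-value function w.r.t. y_i is
        # -mu_i, so shift allocation toward the node with the larger
        # multiplier; the paired update keeps sum(y) == b exactly.
        step = alpha * (mu_i - mu_j)
        y[i] += step
        y[j] -= step
    x = [min(ci, yi) for ci, yi in zip(c, y)]
    return x, y
```

For instance, with costs centered at `c = [1, 2, 3]`, budget `b = 3`, and a complete three-node graph, the allocations converge toward the centralized optimum `x = [0, 1, 2]` while `sum(y) == b` holds at every iteration.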
