Abstract

This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). When rounding quantization is employed, a distributed ADMM algorithm is shown to converge to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial of the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained, which depends only on the quantization resolution and the average degree of the graph. This bound is much preferred in large-scale networks to those of existing algorithms, whose consensus errors grow with the range of the agents' data, the quantization resolution, and the number of agents. To minimize the consensus error, our final algorithm uses dithered quantization to obtain a good starting point and then adopts rounding quantization to reach a consensus. Simulations show that the consensus error of this algorithm is typically less than one quantization resolution for all connected networks with agents' data of arbitrary magnitudes.
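The abstract contrasts two quantizers: deterministic rounding, which maps a value to the nearest lattice point, and dithered (probabilistic) quantization, which rounds up or down at random so the output is unbiased in expectation. The sketch below illustrates both, together with a simple quantized neighborhood-averaging step on a graph. It is an illustrative stand-in under assumed update rules, not the paper's ADMM recursion; the function names, the example graph, and the resolution `delta = 0.5` are all hypothetical.

```python
import random

def round_quantize(x, delta):
    # Deterministic rounding quantizer: nearest point on the lattice k*delta.
    return delta * round(x / delta)

def dither_quantize(x, delta):
    # Dithered (probabilistic) quantizer: rounds up with probability
    # proportional to the fractional offset, so E[Q(x)] = x (unbiased).
    lower = delta * (x // delta)      # lattice point just below x
    p = (x - lower) / delta           # probability of rounding up
    return lower + delta if random.random() < p else lower

def quantized_average_step(values, neighbors, delta, quantizer):
    # One synchronous step: each agent averages the quantized values of
    # itself and its neighbors. (A stand-in for the paper's ADMM update,
    # shown only to make the role of the quantizer concrete.)
    q = [quantizer(v, delta) for v in values]
    return [(q[i] + sum(q[j] for j in neighbors[i])) / (1 + len(neighbors[i]))
            for i in range(len(values))]

# Hypothetical 3-agent path graph with data [0, 1, 2]; true average is 1.0.
vals = [0.0, 1.0, 2.0]
nbrs = {0: [1], 1: [0, 2], 2: [1]}
for _ in range(3):
    vals = quantized_average_step(vals, nbrs, 0.5, round_quantize)
```

A hybrid strategy in the spirit of the abstract would run a few dithered steps first (to land near the true average despite data of arbitrary magnitude) before switching to rounding quantization to settle on a common lattice point.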
