This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantization on a distributed ADMM algorithm. With probabilistic quantization, this algorithm converges linearly to the desired average in the mean sense with bounded variance. When deterministic quantization is employed, the distributed ADMM either converges to a consensus or cycles with a finite period after a finite number of iterations. In the cyclic case, the local quantized variables have the same mean over one period, and hence each node can still reach a consensus. We then derive an upper bound on the consensus error that depends only on the quantization resolution and the average degree of the network. Finally, we propose a two-stage algorithm that combines probabilistic and deterministic quantization. Simulations show that the two-stage algorithm, without requiring a small algorithm parameter, achieves consensus errors that are typically less than one quantization resolution for all connected networks, even when the agents' data have arbitrary magnitudes.
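As a point of reference, the sketch below illustrates the two quantizers contrasted above, under the standard definitions used in the quantized-consensus literature (assumed here, since the abstract does not spell them out): a deterministic quantizer rounds to the nearest point of a uniform lattice with resolution delta, while a probabilistic quantizer rounds to one of the two neighboring lattice points with probabilities chosen so that the output is unbiased. The function names and the choice of resolution are illustrative; the paper's distributed ADMM iteration itself is not reproduced here.

```python
import numpy as np

def deterministic_quantize(x, delta=1.0):
    """Round x to the nearest point of the uniform lattice {k * delta}."""
    return delta * np.round(np.asarray(x, dtype=float) / delta)

def probabilistic_quantize(x, delta=1.0, rng=None):
    """Round x to a neighboring lattice point at random so that
    E[Q(x)] = x (unbiased): the upper point is chosen with probability
    equal to the normalized distance of x from the lower point."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    lower = delta * np.floor(x / delta)
    p_up = (x - lower) / delta              # probability of rounding up
    return lower + delta * (rng.random(x.shape) < p_up)

# Example: the probabilistic quantizer is unbiased in the mean,
# while the deterministic one incurs a systematic rounding error.
x = np.array([0.3, 1.7, -0.6])
print(deterministic_quantize(x))            # [ 0.  2. -1.]
print(np.mean([probabilistic_quantize(x) for _ in range(10000)], axis=0))
```

The unbiasedness of the probabilistic quantizer is what underlies convergence to the true average in the mean sense, whereas the deterministic quantizer's bounded rounding error is what leads to convergence or finite-period cycling around a consensus value.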