Abstract

In this paper, we devise a communication-efficient decentralized algorithm, called communication-censored ADMM (COCA), to solve a convex consensus optimization problem defined over a network. As in popular decentralized consensus optimization algorithms such as ADMM (the alternating direction method of multipliers), at every iteration of COCA a node exchanges its local variable with its neighbors, and then updates its local variable according to the received neighboring variables and its local cost function. The distinguishing feature of COCA is that a node is not allowed to transmit its local variable to its neighbors if this variable is not sufficiently different from the previously transmitted one. The sufficiency of the difference is evaluated by a properly designed censoring function. Although this censoring strategy may slow down the optimization process, it effectively reduces the communication cost. We prove that when the censoring function is properly chosen, COCA converges to the optimal solution of the convex consensus optimization problem. Further, if the local cost functions are strongly convex, COCA has a fast linear convergence rate. Numerical experiments demonstrate that, given a target solution accuracy, COCA significantly reduces the overall communication cost compared to existing algorithms including ADMM, and hence suits applications where network communication is a bottleneck.
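The censoring idea described above can be illustrated with a minimal sketch. The threshold function `tau0 * rho**k` and the function name `should_transmit` below are hypothetical choices for illustration; the paper's actual censoring function may take a different form, but the abstract only requires it to be properly designed (e.g., decreasing over iterations).

```python
import numpy as np

def should_transmit(x_new, x_last_sent, k, tau0=1.0, rho=0.9):
    """Censoring test at iteration k: transmit the local variable only if
    it has moved sufficiently far from the last transmitted copy.

    The geometrically decreasing threshold tau0 * rho**k is an assumed
    example of a censoring function, not the paper's exact choice."""
    threshold = tau0 * rho ** k
    return np.linalg.norm(x_new - x_last_sent) >= threshold

# Early on, small changes are censored (threshold is large) ...
print(should_transmit(np.array([0.1]), np.array([0.0]), k=0))    # censored
# ... but the same change passes later, once the threshold has shrunk.
print(should_transmit(np.array([0.1]), np.array([0.0]), k=100))  # transmitted
```

A node that censors its update simply keeps silent for that iteration; its neighbors reuse the last value they received, which is what trades some convergence speed for lower communication cost.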
