Abstract

This article proposes a distributed optimization method for minimizing the sum of smooth and strongly convex functions under a finite communication bandwidth. Each agent maintains a state and an auxiliary variable to estimate the optimal solution and the average gradient of the global cost function, respectively. To cooperatively estimate the optimal solution, agents exchange their states and auxiliary variables with their neighbors over weight-balanced networks using a dynamic encoding and decoding scheme. After each information exchange, every agent locally updates its own state and auxiliary variable by a quantized gradient-tracking algorithm. We show that the state updated by the proposed quantized algorithm converges to the optimal solution at a linear rate. We also derive a sufficient condition that guarantees a finite communication bandwidth.
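To make the mechanism concrete, below is a minimal Python sketch of one possible quantized gradient-tracking iteration with dynamic encoding and decoding. The quadratic local costs f_i(x) = 0.5(x - b_i)^2, the weight matrix, the uniform quantizer, and all constants (step size, scaling sequence) are illustrative assumptions for this sketch, not the paper's exact design or parameter conditions.

```python
import numpy as np

# Illustrative sketch of quantized gradient tracking with dynamic
# encoding/decoding; all problem data and constants below are assumptions.

n = 4
b = np.array([1.0, 2.0, 3.0, 4.0])       # optimum of sum_i f_i is mean(b) = 2.5
W = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.0, 0.5, 0.5, 0.0],
              [0.0, 0.0, 0.5, 0.5],
              [0.5, 0.0, 0.0, 0.5]])      # weight-balanced directed ring

def grad(x):
    return x - b                          # stacked local gradients of f_i

def quantize(v, half_levels=128):
    # finite-level uniform quantizer on [-1, 1]; clipping models saturation,
    # which a sufficient condition like the paper's is meant to rule out
    return np.round(np.clip(v, -1.0, 1.0) * half_levels) / half_levels

alpha = 0.1                               # step size (assumed)
gamma, s = 0.95, 8.0                      # decaying quantizer scale (assumed)
x = np.zeros(n)                           # state: estimate of the optimal solution
y = grad(x)                               # auxiliary: tracks the average gradient
x_hat = np.zeros(n)                       # decoder states, kept identical at the
y_hat = np.zeros(n)                       # sender and its receivers by construction

for k in range(300):
    # dynamic encoding/decoding: only the quantized, scaled innovation is sent,
    # so a finite number of bits per message suffices
    x_hat += s * quantize((x - x_hat) / s)
    y_hat += s * quantize((y - y_hat) / s)
    g_prev = grad(x)
    # gradient-tracking update using the decoded neighbor information
    x = x + (W - np.eye(n)) @ x_hat - alpha * y
    y = y + (W - np.eye(n)) @ y_hat + grad(x) - g_prev
    s *= gamma                            # shrink the quantizer range over time

print(np.round(x, 3))                     # all entries approach the optimum 2.5
```

The key design point is that sender and receivers update identical decoder states, so only quantized innovations travel over the network; because the scale s decays geometrically while the iterates converge linearly, the innovations stay within the quantizer's range and the per-message bandwidth remains finite.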
