Abstract

This article proposes a distributed optimization method for minimizing the sum of smooth and strongly convex functions under a finite communication bandwidth. Each agent maintains a state and an auxiliary variable that estimate, respectively, the optimal solution and the average gradient of the global cost function. To cooperatively estimate the optimal solution, agents exchange their states and auxiliary variables with their neighbors over weight-balanced networks via a dynamic encoding and decoding scheme. After each information exchange, every agent locally updates its own state and auxiliary variable by a quantized gradient-tracking algorithm. We show that the state updated by the proposed quantized algorithm converges to the optimal solution at a linear rate. We also give a sufficient condition that guarantees a finite communication bandwidth.
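To make the structure of the method concrete, the following is a minimal sketch of the *unquantized* gradient-tracking iteration that the paper builds on: each agent holds a state `x_i` (estimate of the optimizer) and an auxiliary variable `y_i` (estimate of the average gradient), mixes them with its neighbors through a weight matrix, and then takes a gradient-tracking step. The ring network, the quadratic local costs, and all numerical values below are hypothetical illustrations; the paper's dynamic encoding/decoding and quantization are omitted for brevity.

```python
import numpy as np

# Hypothetical setup: n agents, each with a strongly convex quadratic
# local cost f_i(x) = 0.5 * a_i * (x - b_i)^2, so grad f_i(x) = a_i*(x - b_i).
n = 4
a = np.array([1.0, 2.0, 3.0, 4.0])   # curvatures (assumed data)
b = np.array([1.0, -1.0, 2.0, 0.5])  # minimizers of the local costs
grad = lambda x: a * (x - b)         # per-agent gradients, elementwise

# Doubly stochastic (hence weight-balanced) mixing matrix for a ring graph.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

alpha = 0.05          # step size (assumed; must be small enough for stability)
x = np.zeros(n)       # states: each agent's estimate of the global optimizer
y = grad(x)           # auxiliary variables: initialized at the local gradients

for _ in range(2000):
    # Mix states with neighbors, then step along the tracked average gradient.
    x_new = W @ x - alpha * y
    # Update the gradient tracker with the change in the local gradient.
    y = W @ y + grad(x_new) - grad(x)
    x = x_new

x_star = np.sum(a * b) / np.sum(a)   # closed-form minimizer of sum_i f_i
print(x, x_star)                     # all agents' states approach x_star
```

In the quantized variant the abstract describes, the exchanged quantities `x` and `y` would first pass through the dynamic encoder, and each agent would decode its neighbors' transmissions before the mixing step; the linear-rate guarantee is then shown to survive the quantization error.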
