Abstract

Distributed optimization algorithms are widely applied in distributed systems where agents cooperate to minimize a common objective over a connected network. In this paper, we introduce adaptive step sizes that are computed automatically from the variables and gradients of the last two iterates, and are therefore independent of the underlying network topology and of the properties of the objective function. Moreover, we propose an accelerated algorithm based on the dynamic average consensus approach and the Barzilai-Borwein step sizes, and establish the geometric convergence of the algorithm for smooth and strongly convex functions. Finally, numerical experiments validate the theoretical results and show the efficacy of the proposed algorithm.
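The Barzilai-Borwein step sizes mentioned in the abstract are computed only from the differences of successive iterates and gradients, which is why they require no knowledge of the network or of function constants such as the Lipschitz or strong-convexity parameters. A minimal sketch of the standard (centralized) BB rules is below; the quadratic test function, variable names, and iteration count are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def bb_step_sizes(x_prev, x_curr, g_prev, g_curr):
    """Barzilai-Borwein step sizes from the last two iterates.

    With s = x_k - x_{k-1} and y = grad_k - grad_{k-1}, the two
    classical choices are BB1 = (s.s)/(s.y) and BB2 = (s.y)/(y.y).
    """
    s = x_curr - x_prev
    y = g_curr - g_prev
    sy = float(s @ y)
    bb1 = float(s @ s) / sy
    bb2 = sy / float(y @ y)
    return bb1, bb2

# Illustrative use: gradient descent on f(x) = 0.5 x^T A x with BB1 steps.
A = np.diag([1.0, 10.0])           # ill-conditioned quadratic (assumed example)
grad = lambda x: A @ x

x_prev = np.array([1.0, 1.0])
x = x_prev - 0.1 * grad(x_prev)    # one fixed-step iteration to initialize
for _ in range(30):
    g_prev, g = grad(x_prev), grad(x)
    if np.linalg.norm(g) < 1e-12:  # stop before s, y become zero
        break
    alpha, _ = bb_step_sizes(x_prev, x, g_prev, g)
    x_prev, x = x, x - alpha * g
```

Note that the BB iteration is nonmonotone: individual steps may increase the objective, yet on smooth strongly convex problems the iterates still converge to the minimizer.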

