Abstract

Distributed optimization algorithms are widely applied in distributed systems, where agents cooperate over a connected network to minimize a common objective. In this paper, we introduce adaptive step sizes that are computed automatically from the variables and gradients of the last two iterates and are independent of both the underlying network topology and the properties of the objective function. Moreover, we propose an accelerated algorithm based on the dynamic average consensus approach and the Barzilai-Borwein step sizes, and we establish its geometric convergence for smooth and strongly convex functions. Finally, numerical experiments validate the theoretical results and demonstrate the efficacy of the proposed algorithm.
