Abstract

In this paper, we introduce a fast row-stochastic decentralized algorithm, referred to as FRSD, to solve consensus optimization problems over directed communication graphs. The proposed algorithm utilizes only row-stochastic weights, which offers practical advantages in broadcast communication settings over methods that require column-stochastic weights. Under the assumption that each node-specific function is smooth and strongly convex, we show that the FRSD iterate sequence converges at a linear rate to the optimal consensus solution. In contrast to existing methods for directed networks, FRSD achieves linear convergence without employing a gradient tracking (GT) technique explicitly; rather, it implements GT implicitly through a novel momentum term, which significantly reduces the communication and storage overhead of each node when FRSD is used to solve high-dimensional problems over small-to-medium scale networks. In numerical tests, we compare FRSD with other state-of-the-art methods that use row-stochastic and/or column-stochastic weights.
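
For context, the consensus problem referred to above is the standard formulation in which n nodes cooperatively minimize (1/n) Σ_i f_i(x), with each f_i held locally at node i and information exchanged only along the edges of a directed graph. The abstract does not state the FRSD update itself, so the sketch below is only a generic illustration, under our own assumptions, of what one iteration of a row-stochastic decentralized gradient method with a heavy-ball-style momentum term can look like; the mixing matrix A, the step size alpha, the momentum parameter beta, and the helper decentralized_step are hypothetical placeholders and do not reproduce the FRSD update from the paper.

```python
import numpy as np

# Illustrative sketch only: a generic decentralized gradient step using a
# row-stochastic mixing matrix A (each row sums to 1) and a heavy-ball-style
# momentum term. This is NOT the FRSD update from the paper; it just shows
# the kind of row-stochastic averaging the abstract refers to.

def decentralized_step(A, X, X_prev, grads, alpha=0.01, beta=0.5):
    """One synchronous iteration over all n nodes.

    A      : (n, n) row-stochastic weight matrix; A[i, j] > 0 only if node i
             receives from node j over the directed graph, and each row of A
             sums to 1.
    X      : (n, p) current local iterates, one row per node.
    X_prev : (n, p) previous local iterates (used by the momentum term).
    grads  : (n, p) local gradients, row i holding grad f_i(X[i]).
    """
    mixed = A @ X                    # row-stochastic averaging of in-neighbors
    momentum = beta * (X - X_prev)   # illustrative momentum term
    return mixed - alpha * grads + momentum
```

A row-stochastic matrix only requires each node to normalize the weights of the messages it receives, which is why such schemes pair naturally with broadcast communication: a node need not know how many neighbors listen to its transmissions.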
