Abstract

Inspired by the subgradient push method recently developed by Nedić et al., we present a distributed dual averaging push algorithm for constrained nonsmooth convex optimization over time-varying directed graphs. Our algorithm combines the dual averaging method with the push-sum technique and achieves an $O(1/\sqrt{k})$ convergence rate. Compared with the subgradient push algorithm, ours handles constrained problems, converges at a faster rate, and admits a simpler convergence analysis. We also generalize the proposed algorithm so that the input variables of the subgradient oracles are themselves guaranteed to converge.
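To make the combination of dual averaging and push-sum concrete, the following is a minimal sketch (not the authors' implementation) of one plausible iteration: each node pushes its accumulated dual variable and push-sum weight to out-neighbors with column-stochastic weights, adds its local subgradient, and takes a projected dual averaging step. The Euclidean prox function, box constraint, step-size schedule, and problem data below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def project_box(v, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^d (stands in for the constraint set X)."""
    return np.clip(v, lo, hi)

def dual_averaging_push(subgrads, graphs, d, num_iters, alpha0=1.0):
    """Sketch of a distributed dual averaging push iteration (hypothetical helper).

    subgrads : list of callables, subgrads[i](x) -> a subgradient of f_i at x
    graphs   : callable, graphs(k) -> list of out-neighbor lists at time k
    d        : dimension of the decision variable
    """
    n = len(subgrads)
    z = np.zeros((n, d))      # accumulated dual (subgradient) variables
    y = np.ones(n)            # push-sum weights correcting for graph imbalance
    x = np.zeros((n, d))      # primal iterates, started at a feasible point

    for k in range(num_iters):
        out_nbrs = graphs(k)                      # time-varying directed graph at step k
        z_new = np.zeros_like(z)
        y_new = np.zeros_like(y)
        for i in range(n):
            dests = out_nbrs[i] + [i]             # node i also keeps a share for itself
            share = 1.0 / len(dests)              # column-stochastic "push" weights
            for j in dests:
                z_new[j] += share * z[i]
                y_new[j] += share * y[i]
        for i in range(n):
            # add the local subgradient evaluated at the current primal iterate
            z_new[i] += subgrads[i](x[i])
        alpha = alpha0 / np.sqrt(k + 1)           # decaying step size behind the O(1/sqrt(k)) rate
        for i in range(n):
            # de-bias by the push-sum weight, then take the projected dual averaging step
            x[i] = project_box(-alpha * z_new[i] / y_new[i])
        z, y = z_new, y_new
    return x
```

One appeal of this structure is that the column-stochastic weights require each node to know only its own out-degree, so the update works over directed, time-varying graphs without doubly stochastic mixing matrices.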
