Abstract
Inspired by the subgradient-push method recently developed by Nedić et al., we present a distributed dual averaging push algorithm for constrained nonsmooth convex optimization over time-varying directed graphs. Our algorithm combines the dual averaging method with the push-sum technique and achieves an $O(1/\sqrt{k})$ convergence rate. Compared with the subgradient-push algorithm, our algorithm handles constrained problems, converges faster, and admits a simpler convergence analysis. We also generalize the proposed algorithm so that the input variables of the subgradient oracles are guaranteed to converge.
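To make the combination of dual averaging and push-sum concrete, the following is a minimal sketch of a generic iteration of this type, written under assumed notation (column-stochastic mixing matrices $A(k)$, local objectives $f_i$, constraint set $\mathcal{X}$, proximal function $\psi$, and step sizes $\alpha_k$); it illustrates the structure of such methods rather than the paper's exact update rules.

\begin{align}
z_i(k+1) &= \sum_{j=1}^{n} A_{ij}(k)\, z_j(k) + g_i(k), \qquad g_i(k) \in \partial f_i\big(x_i(k)\big),\\
w_i(k+1) &= \sum_{j=1}^{n} A_{ij}(k)\, w_j(k), \qquad w_i(0) = 1,\\
x_i(k+1) &= \operatorname*{arg\,min}_{x \in \mathcal{X}} \left\{ \left\langle \frac{z_i(k+1)}{w_i(k+1)},\, x \right\rangle + \frac{1}{\alpha_{k+1}}\,\psi(x) \right\}.
\end{align}

Here each agent $i$ accumulates subgradients into a dual variable $z_i$ mixed over the directed graph, the push-sum weights $w_i$ correct for the imbalance of the column-stochastic mixing, and the constrained proximal step keeps the primal iterate in $\mathcal{X}$; choosing $\alpha_k \propto 1/\sqrt{k}$ is consistent with the stated $O(1/\sqrt{k})$ rate.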