Abstract

We consider distributed optimization with smooth convex objective functions defined on an undirected connected graph. Inspired by the mirror descent method and RLC circuits, we propose a novel distributed mirror descent method. Compared with the mirror-prox method, our algorithm achieves the same O(1/k) iteration complexity with only half the computation cost per iteration. We further extend our results to the cases where a) the gradients are corrupted by stochastic noise, and b) the objective function is composed of both smooth and non-smooth terms. We demonstrate our theoretical results via numerical experiments.
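To make the setting concrete, below is a minimal sketch of generic distributed mirror descent over a graph: each agent holds a local objective, mixes dual variables with its neighbors via a doubly stochastic matrix, and takes a local mirror step. This is an illustration of the problem class only, not the paper's RLC-inspired algorithm; the graph, objectives, mixing weights, and the Euclidean mirror map are all assumptions chosen for simplicity.

```python
import numpy as np

# Hypothetical setup: a ring graph of n agents, each with a local
# quadratic objective f_i(x) = 0.5 * ||x - b_i||^2; the global goal
# is to minimize the average of the f_i. All names here (W, b, grad_f)
# are illustrative, not taken from the paper.

n, d, T = 8, 5, 200          # agents, dimension, iterations
rng = np.random.default_rng(0)
b = rng.normal(size=(n, d))  # local data; the optimum is b.mean(axis=0)

# Doubly stochastic mixing matrix for a ring graph (Metropolis-style weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def grad_f(i, x):
    """Gradient of agent i's local objective at x."""
    return x - b[i]

# Mirror map psi(x) = 0.5 * ||x||^2, so nabla_psi and its conjugate
# are both the identity and the mirror step reduces to a gradient step;
# a non-Euclidean psi (e.g. negative entropy) would change these two maps.
nabla_psi = lambda x: x
nabla_psi_star = lambda z: z

x = np.zeros((n, d))
eta = 0.1
for k in range(T):
    # Consensus: each agent averages dual variables with its neighbors,
    # then takes a mirror-descent step on its own local gradient.
    z = W @ nabla_psi(x)
    g = np.stack([grad_f(i, x[i]) for i in range(n)])
    x = nabla_psi_star(z - eta * g)

print("max distance to optimum:",
      np.linalg.norm(x - b.mean(axis=0), axis=1).max())
```

Each iteration costs one gradient evaluation per agent; by contrast, a mirror-prox step requires two gradient evaluations per agent per iteration, which is the factor-of-two saving the abstract refers to.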
