Abstract

This study investigates distributed convex optimisation with inequality constraints over unbalanced directed graphs. In distributed optimisation, agents exchange information over a network to obtain an optimal solution, while each agent knows only its own cost function. We propose a distributed primal-dual subgradient method based on a row-stochastic weight matrix associated with the communication network. In the proposed method, the normalised left eigenvector of the weight matrix is estimated by a consensus algorithm, and the subgradient of the Lagrange function is then scaled by the estimated eigenvector components to compensate for the imbalance of the information flow in the network. We show that the primal and dual estimates converge to an optimal primal-dual pair, and we characterise the relation between the convergence rate of the proposed algorithm and the step-size rule. A numerical example confirms the validity of the proposed method.
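To make the mechanism concrete, the following Python sketch illustrates one possible form of such an iteration. It is not the paper's exact algorithm: the graph, the local quadratic costs, the single inequality constraint, the step-size rule, and the dual update are all assumptions chosen for illustration. What it does show is the structure described above: a consensus recursion that estimates the normalised left eigenvector of a row-stochastic weight matrix, and primal-dual subgradient steps rescaled by each agent's estimated eigenvector component.

```python
# Illustrative sketch (not the paper's exact method): distributed
# primal-dual subgradient iteration over a directed graph with a
# row-stochastic weight matrix A.  Each agent also runs a consensus
# recursion estimating the normalised left eigenvector pi of A and
# rescales its subgradient by its own component of the estimate.
#
# Problem data (local quadratic costs f_i and a shared inequality
# constraint x - 1 <= 0) are made up for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 5            # number of agents
T = 3000         # number of iterations

# Row-stochastic weight matrix of a strongly connected digraph
# (directed ring plus self-loops; each row sums to one).
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 0.5
    A[i, (i + 1) % n] = 0.5

# Local cost f_i(x) = 0.5 * (x - c_i)^2  (assumed for illustration).
c = rng.uniform(-2.0, 2.0, size=n)

x  = np.zeros(n)   # primal estimates, one per agent
mu = np.zeros(n)   # dual estimates, one per agent
Y  = np.eye(n)     # eigenvector estimates: row i belongs to agent i

for k in range(T):
    alpha = 1.0 / (k + 10)          # diminishing step size (assumed rule)

    # Consensus on the eigenvector estimates: Y <- A Y, so each row of Y
    # converges to the left eigenvector pi of A; agent i uses Y[i, i].
    Y = A @ Y
    pi_hat = np.clip(np.diag(Y), 1e-6, None)

    # Subgradients of the local Lagrangian
    #   L_i(x, mu) = 0.5 * (x - c_i)^2 + mu * (x - 1)
    grad_x  = (x - c) + mu          # with respect to x
    grad_mu = x - 1.0               # with respect to mu

    # Mix neighbours' values through A, then take subgradient steps
    # scaled by the estimated eigenvector components.
    x  = A @ x  - alpha * grad_x  / pi_hat
    mu = np.maximum(A @ mu + alpha * grad_mu / pi_hat, 0.0)  # project onto mu >= 0

# Centralised optimum for comparison: min sum_i f_i(x) subject to x <= 1.
x_star = min(c.mean(), 1.0)
print("agents' estimates:  ", np.round(x, 3))
print("centralised optimum:", round(x_star, 3))
```

The rescaling is the point of the construction: because agent i's influence in the network is weighted by pi_i, dividing its subgradient by the estimate of pi_i makes the aggregate update behave approximately like a centralised subgradient step on the sum of the local Lagrange functions, which is what compensates for the unbalanced directed information flow.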
