Abstract

In this paper, a distributed optimization problem is investigated via input feedforward passivity. First, an input-feedforward-passivity-based continuous-time distributed algorithm is proposed. It is shown that the error system of the proposed algorithm can be decomposed into a group of individual input feedforward passive (IFP) systems that interact with each other through output feedback information. Based on this IFP framework, convergence conditions on a suitable coupling gain are derived over weight-balanced and uniformly jointly strongly connected (UJSC) topologies. It is also shown that the IFP-based algorithm converges exponentially when the topology is strongly connected. Second, a novel distributed derivative feedback algorithm is proposed based on the passivation of IFP systems. While most works on directed topologies require knowledge of the eigenvalues of the graph Laplacian, the proposed derivative feedback algorithm is fully distributed; that is, it is robust to randomly changing weight-balanced digraphs under any positive coupling gain and requires no global information. Finally, numerical examples are presented to illustrate the proposed distributed algorithms.
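The abstract does not spell out the algorithm's dynamics. As a purely illustrative sketch of the general class it refers to (a continuous-time distributed algorithm with a coupling gain over a weight-balanced digraph), the snippet below simulates a standard gradient-plus-integral-consensus flow by forward Euler. The specific dynamics, the quadratic local costs, and all parameter values are assumptions chosen for illustration; they are not the paper's IFP-based or derivative feedback algorithm.

```python
# Hypothetical sketch (not the paper's exact dynamics): forward-Euler
# simulation of a generic continuous-time distributed algorithm of the
# gradient + integral-consensus family,
#   xdot_i = -grad f_i(x_i) - sigma * sum_j a_ij (x_i - x_j) - v_i
#   vdot_i =                   sigma * sum_j a_ij (x_i - x_j),
# over a weight-balanced digraph, to show the role of the coupling gain sigma.
# Local costs f_i(x) = 0.5 * (x - c_i)^2 are illustrative only.
import numpy as np

# Weight-balanced directed 4-cycle (row sums equal column sums).
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
L_out = np.diag(A.sum(axis=1)) - A            # out-degree Laplacian

c = np.array([1.0, 3.0, -2.0, 6.0])           # local minimizers of f_i
grad = lambda x: x - c                        # grad f_i(x_i) = x_i - c_i

sigma = 1.0                                   # coupling gain
dt, steps = 1e-3, 40000
x = np.zeros(4)                               # primal (decision) states
v = np.zeros(4)                               # integral feedback states

for _ in range(steps):
    disagreement = L_out @ x                  # sum_j a_ij (x_i - x_j) per agent
    x_dot = -grad(x) - sigma * disagreement - v
    v_dot = sigma * disagreement
    x, v = x + dt * x_dot, v + dt * v_dot

print("agent states:", x)                     # all close to mean(c) = 2.0
print("optimizer   :", c.mean())
```

With the weight-balanced cycle and zero-initialized integral states, all agent states converge to the minimizer of the summed local costs, here the mean of the c_i (2.0). The gain sigma in this sketch plays the role of the coupling gain whose admissible range the paper characterizes for its own algorithms.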
