Abstract
In this paper, the distributed zero-sum differential game problem for multi-agent nonlinear systems is investigated using the adaptive dynamic programming (ADP) technique, bringing together distributed control, zero-sum differential graphical games, and ADP. To find the cooperative Nash-equilibrium solutions of the associated coupled Hamilton-Jacobi-Isaacs (HJI) equations, the ADP technique is employed, in which a critic network with a novel updating law is constructed to approximate the cooperative cost function online. By means of the Lyapunov direct method, the states of the closed-loop system and the weight estimation errors are guaranteed to be uniformly ultimately bounded (UUB). Finally, simulation results demonstrate the effectiveness of the proposed method.
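For orientation, a minimal sketch of the problem structure described above is given below; the notation (local neighborhood error \(\delta_i\), weighting matrices \(Q_i\), \(R_{ii}\), attenuation level \(\gamma_i\), critic weights \(\hat W_i\), and activation functions \(\phi_i\)) is an illustrative assumption, since the abstract does not state the paper's exact formulation. In a zero-sum graphical game, each agent \(i\) minimizes a local cost with respect to its control \(u_i\) while the disturbance \(w_i\) acts to maximize it,
\[
V_i(\delta_i) = \int_t^{\infty} \bigl( \delta_i^{\top} Q_i \delta_i + u_i^{\top} R_{ii} u_i - \gamma_i^{2}\, w_i^{\top} w_i \bigr)\, \mathrm{d}\tau ,
\]
and the Nash-equilibrium value satisfies a coupled HJI equation of the generic form
\[
\min_{u_i} \max_{w_i} \; H_i\bigl(\delta_i, \nabla V_i, u_i, w_i, u_{-i}\bigr) = 0 .
\]
The critic network then approximates the value as \(\hat V_i(\delta_i) = \hat W_i^{\top} \phi_i(\delta_i)\), and the updating law adjusts \(\hat W_i\) online so that the residual of the approximated HJI equation is driven toward zero.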