Abstract

This paper introduces a new class of multi-agent discrete-time dynamical games known as dynamic graphical games, where the interactions between agents are prescribed by a communication graph structure. The graphical game arises from multi-agent dynamical systems in which pinning control is used to make all the agents synchronize to the state of a command generator or target agent. The relation between dynamic graphical games and standard multi-player games is shown. A new notion of Interactive Nash equilibrium is introduced, which holds if all the agents are in Nash equilibrium and the graph is strongly connected. The paper brings together discrete Hamiltonian mechanics, distributed multi-agent control, optimal control theory, and game theory to formulate and solve these multi-agent graphical games. The relationships between the discrete-time Hamilton-Jacobi equation and the discrete-time Bellman equation are used to formulate a discrete-time Hamilton-Jacobi-Bellman equation for dynamic graphical games. Proofs of Nash equilibrium, stability, and convergence are given. A reinforcement learning value iteration algorithm is given to solve the dynamic graphical games.
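To make the value iteration idea concrete, the following is a minimal, hypothetical sketch of such an algorithm for a graphical game, assuming scalar agent dynamics, quadratic local value functions on each agent's neighborhood tracking error, and a fixed directed graph with one pinned agent. The system matrices, graph weights, cost weights, and the simplification of ignoring neighbors' control coupling are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical value-iteration sketch for a dynamic graphical game (not the
# paper's exact algorithm). Each agent i keeps a quadratic value function
# V_i(d) = p_i * d^2 on its neighborhood tracking error d, and alternates a
# greedy local policy update with a Bellman-like value update.

import numpy as np

# Scalar agent dynamics x_{k+1} = A x_k + B u_k (identical for every agent).
A, B = 0.95, 0.5
Q, R = 1.0, 0.1          # local error and control weights (assumed)
N = 4                    # number of follower agents

# Directed adjacency matrix E[i, j]: weight of edge j -> i, plus pinning gains g.
E = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
g = np.array([1.0, 0.0, 0.0, 0.0])     # only agent 0 is pinned to the leader
deg = E.sum(axis=1) + g                # in-degree plus pinning gain per agent

p = np.zeros(N)                        # value parameters, V_i(d) = p_i * d^2
K = np.zeros(N)                        # local feedback gains, u_i = -K_i * d_i

for _ in range(200):
    # Policy improvement: greedy local gain minimizing the one-step cost
    # Q d^2 + R u^2 plus the current value of the next error
    # d_next = A d + deg_i B u (neighbor control coupling ignored here).
    K = (deg * B * p * A) / (R + deg**2 * B**2 * p)
    # Value update: Bellman-like recursion on the closed-loop error dynamics.
    A_cl = A - deg * B * K
    p_new = Q + R * K**2 + p * A_cl**2
    if np.max(np.abs(p_new - p)) < 1e-10:
        p = p_new
        break
    p = p_new

print("converged value parameters p_i:", p)
print("local feedback gains K_i:", K)
```

Starting from zero value parameters, the recursion converges to a fixed point of a Riccati-like equation for each agent; in the full graphical game the value and policy updates would additionally account for the neighbors' policies entering each agent's error dynamics.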
