Abstract
Much of the classical work on algorithms for multi-armed bandits focuses on rewards that are stationary over time. By contrast, we study multi-armed bandit (MAB) games, where the rewards obtained by an agent also depend on how many other agents choose the same arm (as might be the case in many competitive or cooperative scenarios). Such systems are naturally nonstationary due to the interdependent evolution of agents, and in general MAB games can be intractable to analyze using typical equilibrium concepts (such as perfect Bayesian equilibrium).

We introduce a general model of multi-armed bandit games, and study the dynamics of these games under a large system approximation. We investigate conditions under which the bandit dynamics have a steady state we refer to as a mean field steady state (MFSS). In an MFSS, the proportion of agents playing the various arms, called the population profile, is assumed stationary over time; the steady state definition then requires a consistency check that this stationary profile arises from the policies chosen by the agents.

We establish the following results in the paper. First, we establish existence of an MFSS under broad conditions. Second, we show under a contraction condition that the MFSS is unique, and that the population profile converges to it from any initial state. Finally, we show that under the contraction condition, MFSS is a good approximation to the behavior of finite systems with many agents. The contraction condition requires that the agent population regenerates sufficiently often, and that the sensitivity of the reward function to the population profile is low enough. Through numerical experiments, we find that in settings with negative externalities among the agents, convergence obtains even when our condition is violated; while in settings with positive externalities among the agents, our condition is tighter.
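As a rough illustration of the consistency requirement behind an MFSS (not the paper's model; the linear congestion rewards, softmax response rule, and parameter values below are invented for this sketch), the following Python snippet iterates a toy population-profile map with a negative externality and checks that the resulting profile reproduces itself:

```python
import numpy as np

def mean_field_map(profile, base_rewards, sensitivity, temperature=0.1):
    """One step of a toy mean-field update: given the current population
    profile (fraction of agents on each arm), compute each arm's reward
    under a negative externality and return the profile induced when
    agents respond via a softmax choice rule.  All modeling choices here
    (linear congestion, softmax response) are illustrative assumptions,
    not the model analyzed in the paper."""
    rewards = base_rewards - sensitivity * profile   # crowded arms pay less
    logits = rewards / temperature
    logits -= logits.max()                           # numerical stability
    new_profile = np.exp(logits)
    return new_profile / new_profile.sum()

def find_steady_state(base_rewards, sensitivity, tol=1e-10, max_iter=10_000):
    """Iterate the map from a uniform profile; when the map is a contraction
    (roughly, low sensitivity of rewards to the profile), the iteration
    converges to a unique fixed point -- the analogue of an MFSS here."""
    profile = np.full(len(base_rewards), 1.0 / len(base_rewards))
    for _ in range(max_iter):
        nxt = mean_field_map(profile, base_rewards, sensitivity)
        if np.max(np.abs(nxt - profile)) < tol:
            return nxt
        profile = nxt
    return profile

if __name__ == "__main__":
    base = np.array([1.0, 0.8, 0.6])   # hypothetical per-arm base rewards
    ss = find_steady_state(base, sensitivity=0.5)
    print("steady-state profile:", np.round(ss, 4))
    # Consistency check: the profile reproduces itself under the map.
    print("residual:", np.max(np.abs(mean_field_map(ss, base, 0.5) - ss)))
```

In this toy version, increasing `sensitivity` strengthens the dependence of rewards on the profile; the paper's contraction condition is analogous in spirit, requiring that dependence to be weak enough (together with sufficiently frequent population regeneration) for uniqueness and convergence.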