Abstract

This paper analyzes learning in multi-player noncooperative games with risky payoffs. The goal of the paper is to assess the relative importance of stochastic payoffs and expected payoffs in the learning process. A general learning model that nests several variations of reinforcement learning, belief-based learning, and experience-weighted attraction learning is used to analyze behavior in coordination game and prisoner's dilemma experiments with probabilistic payoffs. In all experiments, some subjects learn from past lottery outcomes, though the importance of these stochastic payoffs relative to expected payoffs depends on the game. Stochastic payoffs are less important when posted probabilities are equal to expected payoffs and more important when subjects are informed how much they would have earned from foregone strategies.
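The abstract refers to a general model that nests reinforcement, belief-based, and experience-weighted attraction (EWA) learning. As a rough illustration of how such nesting works, here is a minimal Python sketch of the standard EWA attraction update in the Camerer-Ho form; the parameter names (phi, delta, rho), the logit choice rule, and the numeric values are conventional assumptions for exposition, not the exact specification estimated in this paper.

```python
import numpy as np

def ewa_update(attractions, experience, chosen, payoffs,
               phi=0.9, delta=0.5, rho=0.9):
    """One round of a standard EWA attraction update (Camerer-Ho form).

    attractions : A_j(t-1), one attraction per strategy (float array)
    experience  : N(t-1), the scalar experience weight
    chosen      : index of the strategy actually played this round
    payoffs     : payoff each strategy would have earned this round
    phi         : decay on past attractions
    delta       : weight on foregone payoffs (0 -> pure reinforcement;
                  1 with rho = phi -> belief-based learning)
    rho         : decay on the experience weight
    """
    n_new = rho * experience + 1.0
    weights = np.full_like(attractions, delta)   # foregone strategies weighted by delta
    weights[chosen] = 1.0                        # chosen strategy gets full weight
    a_new = (phi * experience * attractions + weights * payoffs) / n_new
    return a_new, n_new

def logit_choice_probs(attractions, lam=1.0):
    """Map attractions to choice probabilities via a logit response rule."""
    z = np.exp(lam * (attractions - attractions.max()))  # numerically stable softmax
    return z / z.sum()

# Example: two strategies, strategy 0 played, payoffs (3, 1) this round
a, n = ewa_update(np.zeros(2), 1.0, chosen=0, payoffs=np.array([3.0, 1.0]))
print(logit_choice_probs(a, lam=2.0))
```

In this nested form, setting delta to 0 reinforces only the realized payoff of the chosen strategy, while delta equal to 1 (with rho equal to phi) weights realized and foregone payoffs equally, which corresponds to belief-based learning; estimating where subjects fall between these cases is how a nested model can separate responses to realized lottery outcomes from responses to expected payoffs.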
