Abstract

We study the asymptotics of a class of two-player, zero-sum stochastic games with incomplete information on one side as the time span between two consecutive stages vanishes. The informed player observes the realization of a Markov chain on which the payoffs depend, whereas the noninformed player only observes his opponent's actions. We show the existence of a limit value; this value is characterized through an auxiliary optimization problem and as the solution of a Hamilton-Jacobi equation.
