Abstract

Recently, P. L. Lions demonstrated the connection between the value function of stochastic optimal control and viscosity solutions of the Hamilton-Jacobi-Bellman equation [cf. 10, 11, 12]. The purpose of this paper is to partially extend his results to stochastic differential games, in which two players oppose each other. If the value function of a stochastic differential game is smooth enough, then it satisfies a second-order partial differential equation with max-min or min-max type nonlinearity, called the Isaacs equation [cf. 5]. Since, under some mild conditions, a nonlinear function can be written as the min-max of appropriate affine functions, stochastic differential game theory provides convenient representation formulas for solutions of nonlinear partial differential equations [cf. 1, 2, 3].
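As an illustration of the type of equation meant here (in generic notation not fixed by the abstract, with drift $b$, diffusion coefficient $a$, running cost $f$, and terminal cost $g$), the upper Isaacs equation for a finite-horizon game typically reads

\[
\partial_t V + \min_{z \in Z} \max_{y \in Y} \Big\{ \tfrac{1}{2}\,\mathrm{tr}\big(a(x,y,z)\, D^2 V\big) + b(x,y,z)\cdot DV + f(x,y,z) \Big\} = 0,
\qquad V(T,x) = g(x),
\]

where the max-min (rather than min-max) over the controls $y$ and $z$ of the two players gives the lower Isaacs equation.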
