Abstract

This paper concerns an optimal control problem on the space of probability measures over a compact Riemannian manifold. It is motivated by situations in which the central planner of a deterministic controlled system has only probabilistic knowledge of the initial condition; the lack of information considered here is of this very specific type. We show that the value function satisfies a dynamic programming principle, and we prove that it is the unique viscosity solution of a suitable Hamilton-Jacobi-Bellman equation. The notion of viscosity solution is defined using test functions that are directionally differentiable in the space of probability measures.
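The abstract does not display the equation itself. As a purely illustrative sketch (not taken from the paper), a Hamilton-Jacobi-Bellman equation for a value function $V(t,\mu)$ on $[0,T]\times\mathcal{P}(M)$ typically takes a form like the following, where the control set $A$, dynamics $f$, running cost $L$, and terminal cost $G$ are all hypothetical placeholders:

```latex
% Hedged sketch: a generic HJB equation on the space of probability
% measures P(M) over a manifold M. All symbols (A, f, L, G, and the
% measure derivative \nabla_\mu) are assumptions for illustration only.
\[
  -\partial_t V(t,\mu)
  + \sup_{a \in A} \left\{
      - \int_M \big\langle \nabla_\mu V(t,\mu)(x),\, f(x,a) \big\rangle \, d\mu(x)
      - L(\mu, a)
    \right\} = 0,
  \qquad V(T,\mu) = G(\mu).
\]
```

Here $\nabla_\mu V$ stands for a derivative of $V$ in the measure variable; in the paper this role is played by the directional derivatives used to define the test functions.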
