Abstract

Within a general abstract framework, we show that any optimal control problem in standard form can be translated into a stochastic target problem as defined in Soner and Touzi (2002) [5], whenever the underlying filtered probability space admits a suitable martingale representation property. This provides a unified way of treating these two classes of stochastic control problems. As an illustration, we show, within a jump-diffusion framework, how the Hamilton–Jacobi–Bellman equations associated with an optimal control problem in standard form can easily be retrieved from the partial differential equations associated with its stochastic target counterpart.
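To fix ideas, the two problem classes referred to above are commonly formulated as follows; the notation here is illustrative (a Mayer-type criterion $g$, a generic set of admissible controls $\mathcal{U}$) and is not necessarily the exact setting of the paper. The standard-form problem maximizes an expected terminal reward, while the stochastic target problem asks for the smallest initial endowment $y$ from which the target can be dominated almost surely:
\[
V(t,x) \;=\; \sup_{\nu \in \mathcal{U}} \mathbb{E}\!\left[ g\!\left(X^{t,x,\nu}_T\right) \right],
\qquad
v(t,x) \;=\; \inf\left\{ y \in \mathbb{R} \;:\; \exists\, \nu \in \mathcal{U} \text{ such that } Y^{t,y,\nu}_T \ge g\!\left(X^{t,x,\nu}_T\right) \text{ a.s.} \right\}.
\]
Heuristically, a martingale representation property allows the expectation in the first problem to be realized pathwise by a controlled martingale $Y$ started at $y = V(t,x)$ with $Y_T = g(X^{t,x,\nu}_T)$ a.s., which is the mechanism behind the translation described above.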
