In this paper a theory of optimal control is developed for stochastic systems whose performance is measured by the exponential of an integral form. Such a formulation of the cost function is shown to be not only general and useful but also analytically tractable. Starting with very general classes of stochastic systems, optimality conditions are obtained that exploit the multiplicative decomposability of the exponential-of-integral form. Specializing to partially observed systems of stochastic differential equations with Brownian motion disturbances, optimality conditions are obtained that parallel those for systems with integral costs. Also treated is the special case of linear systems with exponential-of-quadratic costs, for which explicit optimal controls are obtainable. In addition, several general results of independent interest are obtained concerning the optimality of stochastic systems.
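For concreteness, a criterion of this type can be written as follows. This is a minimal sketch; the symbols $\theta$, $L$, $f$, $\sigma$, and $W_t$ are illustrative notation and not necessarily that of the paper: the controller minimizes the expected exponential of an integral running cost over a finite horizon, subject to an Itô diffusion driven by Brownian motion.

\[
  J(u) \;=\; \mathbb{E}\!\left[\exp\!\left(\theta \int_0^T L(x_t,u_t,t)\,dt\right)\right],
  \qquad
  dx_t = f(x_t,u_t,t)\,dt + \sigma(x_t,t)\,dW_t .
\]

The multiplicative decomposability referred to above is the factorization
$\exp\!\big(\theta\int_0^T L\,dt\big) = \exp\!\big(\theta\int_0^t L\,ds\big)\,\exp\!\big(\theta\int_t^T L\,ds\big)$,
which plays the role for this criterion that additivity over subintervals plays for ordinary integral costs.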