Abstract

We study optimal control problems in which the dynamic system evolves according to a linear stochastic differential equation with (a) multiplicative and additive white noise, (b) a mixture of white noise, and (c) white noise and colored noise. The drift rate and the diffusion matrix of the linear dynamic system depend on a continuous-time Markov chain with finite state space that is (I) partially observed and (II) completely observed. Using results from Wonham filter theory, we reduce the partially observed problems to ones with complete observation. We solve the optimal control problems explicitly using the dynamic programming technique, and we present three applications to illustrate our theoretical results.
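The reduction from partial to complete observation rests on the Wonham filter, which replaces the unobserved Markov chain by its conditional distribution given the observation path. As a minimal sketch (not the paper's construction), the following assumes a two-state chain with an illustrative generator Q, drift levels h, and noise intensity sigma, and runs an Euler discretization of the filter equation with renormalization onto the probability simplex:

```python
import numpy as np

# Hedged sketch: Euler discretization of the Wonham filter for a
# two-state continuous-time Markov chain observed in additive white noise,
#   dY_t = h(alpha_t) dt + sigma dW_t.
# All parameter values (Q, h, sigma, T) are illustrative assumptions.

rng = np.random.default_rng(0)

Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])    # generator of the hidden chain
h = np.array([0.0, 1.0])        # observation drift levels h(i)
sigma = 0.5                     # observation noise intensity
dt, T = 1e-3, 5.0
n = int(T / dt)

alpha = 0                       # hidden chain state
p = np.array([0.5, 0.5])        # filter: P(alpha_t = i | observations up to t)
history = np.empty((n, 2))

for k in range(n):
    # hidden chain transition (first-order approximation of exp(Q dt))
    if rng.random() < -Q[alpha, alpha] * dt:
        alpha = 1 - alpha
    # noisy observation increment
    dY = h[alpha] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    # Wonham filter update: dp = Q^T p dt + p (h - hbar)/sigma^2 (dY - hbar dt)
    hbar = p @ h
    p = p + (Q.T @ p) * dt + p * (h - hbar) / sigma**2 * (dY - hbar * dt)
    # Euler steps can leave the simplex; clip and renormalize
    p = np.clip(p, 0.0, None)
    p /= p.sum()
    history[k] = p

print(history[-1])
```

The filter output can then be fed into a completely observed control problem in which the conditional probabilities play the role of an observed state, which is the spirit of the reduction described above.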

