Abstract

This chapter discusses the optimal control of stochastic systems. Every known deterministic mathematical model can be regarded as a simplification of a suitable stochastic model. Thus, there are random algebraic equations, stochastic differential equations, stochastic integral equations, and, more generally, stochastic functional equations. The model most commonly used in the study of optimal control theory is the Itô stochastic differential or functional differential equation with the Poisson random measure omitted. Stochastic differential equations modeled on ordinary differential equations containing Markovian jump parameters have also been used in control theory. Existence theorems for optimal controls of deterministic differential and functional differential equations are well developed. One of the most fascinating and challenging current problems in control theory is the proof of the existence of optimal controls, especially feedback controls, that is, controls dependent on the state of the system. The chapter considers open-loop controls and discusses some of the recent existence theorems for systems governed by stochastic differential and functional differential equations, including relaxed stochastic controls.
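To make the central object concrete, the following is a minimal sketch (not from the chapter) of simulating a controlled Itô stochastic differential equation dX_t = (aX_t + b u(t)) dt + σ dW_t by the standard Euler–Maruyama scheme, with an open-loop control u that depends on time only, not on the state. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def euler_maruyama(x0, a, b, sigma, u, T=1.0, n=1000, seed=0):
    """Simulate dX_t = (a*X_t + b*u(t)) dt + sigma*dW_t on [0, T]
    with n Euler-Maruyama steps; u is an open-loop control u(t)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        # Brownian increment dW ~ N(0, dt)
        dw = rng.normal(0.0, np.sqrt(dt))
        x[k + 1] = x[k] + (a * x[k] + b * u(k * dt)) * dt + sigma * dw
    return x

# Open-loop control: a fixed function of time, independent of the state X_t.
path = euler_maruyama(x0=1.0, a=-1.0, b=1.0, sigma=0.2, u=lambda t: np.sin(t))
```

A feedback control, by contrast, would be a function u(t, X_t) of the current state; the existence theorems the chapter surveys concern the open-loop case.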
