Abstract

The paper gives an overview of stochastic optimal control theory and its applications to operational research. Stochastic control theory deals with the intertemporal optimization of dynamic systems under uncertainty. After a short review of deterministic optimal control theory and the theory of stochastic dynamic systems, a general stochastic optimal control problem is formulated in both discrete and continuous time. Among the methods for solving such problems, stochastic dynamic programming and the stochastic maximum principle in particular are introduced. Examples from the areas of finance, advertising, production-inventory models, and information policy illustrate the applicability of stochastic control theory to operational research problems. Moreover, its actual and potential impact on OR is discussed from a methodological point of view.
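To make the discrete-time formulation concrete, the following is a minimal sketch of stochastic dynamic programming by backward induction on the Bellman recursion V_t(x) = min_u E[c(x,u,d) + V_{t+1}(f(x,u,d))], applied to a toy production-inventory problem of the kind the paper cites. All problem data below (costs, demand distribution, horizon) are illustrative assumptions, not taken from the paper.

```python
# Toy production-inventory problem solved by stochastic dynamic programming.
# State x_t: inventory level (0..MAX_INV); control u_t: units produced;
# random demand d_t takes values in DEMANDS with probabilities PROBS.
# All parameters are illustrative assumptions.

MAX_INV = 5
DEMANDS = [0, 1, 2]
PROBS = [0.2, 0.5, 0.3]
PROD_COST = 2.0    # cost per unit produced
HOLD_COST = 1.0    # cost per unit held at end of a period
SHORT_COST = 4.0   # penalty per unit of unmet demand
HORIZON = 3        # number of decision periods

def stage(x, u, d):
    """Next inventory level and stage cost for state x, control u, demand d."""
    supply = x + u
    next_x = min(max(supply - d, 0), MAX_INV)
    cost = (PROD_COST * u
            + HOLD_COST * next_x
            + SHORT_COST * max(d - supply, 0))
    return next_x, cost

def solve():
    """Backward induction: value function V[t][x] and optimal policy mu[t][x]."""
    V = [[0.0] * (MAX_INV + 1) for _ in range(HORIZON + 1)]  # terminal V == 0
    mu = [[0] * (MAX_INV + 1) for _ in range(HORIZON)]
    for t in range(HORIZON - 1, -1, -1):
        for x in range(MAX_INV + 1):
            best_u, best_val = 0, float("inf")
            for u in range(MAX_INV + 1 - x):  # keep supply within capacity
                # Expected stage cost plus expected cost-to-go over demand.
                exp_cost = 0.0
                for d, p in zip(DEMANDS, PROBS):
                    next_x, c = stage(x, u, d)
                    exp_cost += p * (c + V[t + 1][next_x])
                if exp_cost < best_val:
                    best_u, best_val = u, exp_cost
            V[t][x], mu[t][x] = best_u and best_val or best_val, best_u
            V[t][x] = best_val
    return V, mu

V, mu = solve()
print("Expected optimal cost from empty inventory:", round(V[0][0], 3))
print("First-period production decision at x=0:", mu[0][0])
```

The backward pass evaluates each feasible control against the expectation over demand, which is exactly the discrete-time Bellman recursion; the continuous-time analogue discussed in the paper replaces this recursion with the Hamilton-Jacobi-Bellman equation.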
