Abstract

Uncertainty is inherent in most real-world systems, and it places many disadvantages (and sometimes, surprisingly, advantages) on humankind's efforts, which are usually associated with the quest for optimal results. The systems mainly studied in this book are dynamic, that is, they evolve over time. Moreover, they are described by Itô stochastic differential equations and are sometimes called diffusion models. The basic source of uncertainty in diffusion models is white noise, which represents the joint effect of a large number of independent random forces acting on the system. Since the systems are dynamic, the relevant decisions (controls), which are made on the basis of the most up-to-date information available to the decision makers (controllers), must also change over time. The decision makers must select an optimal decision from among all possible ones so as to achieve the best expected result with respect to their goals. Such optimization problems are called stochastic optimal control problems. Stochastic optimal control problems arise in a wide range of physical, biological, economic, and management systems, to name just a few. In this chapter we set up a rigorous mathematical framework for stochastic optimal control problems.
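The ideas in the abstract can be made concrete with a small numerical sketch. The following Python snippet (not from the book; all names, the drift, the cost functional, and the linear feedback law are assumptions chosen purely for illustration) simulates a controlled Itô diffusion dX_t = u_t dt + σ dW_t by the Euler–Maruyama scheme, where the Brownian increments play the role of the white-noise disturbance and the control u_t is chosen as a function of the currently observed state:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cost(k, sigma=0.5, x0=1.0, T=1.0, n_steps=200, n_paths=2000):
    """Monte Carlo estimate of the expected cost E[ int_0^T (x_t^2 + u_t^2) dt ]
    for dX_t = u_t dt + sigma dW_t under the feedback control u_t = -k * x_t."""
    dt = T / n_steps
    x = np.full(n_paths, x0)
    cost = np.zeros(n_paths)
    for _ in range(n_steps):
        u = -k * x                                  # control uses only current information
        cost += (x**2 + u**2) * dt                  # accumulate running cost
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)  # white-noise (Brownian) increment
        x = x + u * dt + sigma * dW                 # Euler–Maruyama step of the diffusion
    return cost.mean()

# A moderate feedback gain should achieve a lower expected cost
# than applying no control at all (k = 0):
print(simulate_cost(0.0), simulate_cost(0.6))
```

Comparing the two printed values shows the essence of a stochastic optimal control problem: among all admissible feedback laws (here, the one-parameter family u = -k x), the controller seeks the one minimizing the expected cost, even though each individual trajectory is random.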
