Abstract

Chance theory is a useful tool for analyzing indeterminacy that involves both uncertainty and randomness. Based on chance theory, an optimal control model for uncertain random continuous-time systems is introduced to formulate dynamic optimization problems. Applying the method of dynamic programming, the principle of optimality is presented, and the equation of optimality is then derived to solve the proposed problem. In addition, three special cases of the optimal control problem are discussed using the equation obtained. Finally, a numerical example and an optimal cash balance problem are given to demonstrate the effectiveness of the results.
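For orientation, the sketch below shows the general Hamilton-Jacobi-Bellman-style form that a dynamic-programming equation of optimality typically takes for a continuous-time value function. The dynamics coefficients $\nu$ and $\sigma$, the expectation operator, and the terminal reward $G$ are placeholders assumed here for illustration; the exact chance-theoretic form for uncertain random systems follows the derivation in the paper and is not reproduced.

% Hypothetical sketch (not the paper's exact result): value function and an
% HJB-style equation of optimality for a continuous-time control problem.
\begin{align}
  J(t,x) &= \sup_{u}\, \mathbb{E}\!\left[\int_{t}^{T} f(s, X_s, u_s)\,\mathrm{d}s + G(T, X_T) \,\middle|\, X_t = x\right],\\
  -\frac{\partial J}{\partial t}(t,x) &= \sup_{u}\left\{ f(t,x,u) + \frac{\partial J}{\partial x}(t,x)\,\nu(t,x,u) + \frac{1}{2}\,\sigma^{2}(t,x,u)\,\frac{\partial^{2} J}{\partial x^{2}}(t,x)\right\},\\
  J(T,x) &= G(T,x).
\end{align}

The principle of optimality mentioned in the abstract justifies the recursive structure of $J$: an optimal policy restricted to any tail interval $[t, T]$ must itself be optimal for the subproblem starting at $(t, X_t)$, which is what allows the value function to satisfy a pointwise equation of this kind.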
