Abstract

Problems of optimal control of linear and nonlinear stochastic systems with quadratic quality criteria are studied. For such problems, the existence of an optimal control in feedback form is proved by the method of dynamic programming. The work consists of two parts. The first part deals with the linear problem: here the existence of a solution to the Cauchy problem for the generalized Riccati equation is proved by a method based on the idea of the Bellman linearization scheme. The proof consists in the direct application of existence theorems for ordinary differential equations to the generalized Riccati equation. The main part of the article is its second part, which concerns the nonlinear problem. A meaningful result is obtained only when the control enters the stochastic part with a small parameter $$\varepsilon$$. The existence, for small $$\varepsilon$$, of a solution of the Bellman equation corresponding to the nonlinear problem is proved using the abstract implicit function theorem.
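
A hedged sketch of the classical setting the abstract refers to (the paper's exact generalized formulation is not reproduced here) may help fix notation. In the standard linear-quadratic case the state follows a linear Itô equation and the cost is quadratic,

$$dx_t = (Ax_t + Bu_t)\,dt + \sigma\,dw_t, \qquad J(u) = \mathbb{E}\int_0^T \bigl(x_t^{\top} Q x_t + u_t^{\top} R u_t\bigr)\,dt + \mathbb{E}\,x_T^{\top} G x_T,$$

and dynamic programming reduces the Bellman equation to a Cauchy problem for a matrix Riccati ordinary differential equation whose solution yields the optimal feedback law:

$$\dot P(t) + A^{\top} P(t) + P(t) A - P(t) B R^{-1} B^{\top} P(t) + Q = 0, \quad P(T) = G, \qquad u^{*}(t,x) = -R^{-1} B^{\top} P(t)\,x.$$

The generalized Riccati equation studied in the paper, and the Bellman equation for the nonlinear problem in which $$\varepsilon$$ multiplies the control in the diffusion term, follow this same pattern; their precise form is given only in the full text.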
