Abstract

The non-linear stochastic optimal control of quasi non-integrable Hamiltonian systems for minimizing their first-passage failure is investigated. A controlled quasi non-integrable Hamiltonian system is reduced to a one-dimensional controlled diffusion process of the averaged Hamiltonian by using the stochastic averaging method for quasi non-integrable Hamiltonian systems. The dynamical programming equations and their associated boundary and final-time conditions for the problems of maximizing the reliability and of maximizing the mean first-passage time are formulated. The optimal control law is derived from the dynamical programming equations and the control constraints. The dynamical programming equations for the maximum reliability problem and the maximum mean first-passage time problem are finalized, and their relationships to the backward Kolmogorov equation for the reliability function and to the Pontryagin equation for the mean first-passage time, respectively, are pointed out. The boundary condition at zero Hamiltonian is discussed. Two examples are worked out to illustrate the application and effectiveness of the proposed procedure.
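
As a minimal sketch of the structure described above, assuming the standard stochastic averaging formulation for quasi non-integrable Hamiltonian systems (the drift \bar{m}(H), diffusion \bar{\sigma}(H), control bounds b_i and critical level H_c are generic placeholders, not necessarily the authors' notation), the averaged Hamiltonian H(t) follows a one-dimensional Itô equation and the value function V of the maximum reliability problem satisfies a dynamical programming equation of the form

  dH = \Big[ \bar{m}(H) + \Big\langle \sum_i \frac{\partial H}{\partial p_i}\, u_i \Big\rangle \Big] dt + \bar{\sigma}(H)\, dB(t),

  \frac{\partial V}{\partial t} = -\sup_{|u_i| \le b_i} \Big\{ \Big[ \bar{m}(H) + \Big\langle \sum_i \frac{\partial H}{\partial p_i}\, u_i \Big\rangle \Big] \frac{\partial V}{\partial H} + \frac{1}{2}\, \bar{\sigma}^2(H)\, \frac{\partial^2 V}{\partial H^2} \Big\}, \qquad 0 \le H < H_c,

  V(H, t_f) = 1 \ \ \text{for} \ 0 \le H < H_c, \qquad V(H_c, t) = 0.

Under a bounded control constraint of this assumed form the supremum is attained by a bang-bang-type law; substituting the maximizing control back into the equation reduces it to a backward Kolmogorov equation for the conditional reliability of the optimally controlled system, while the analogous time-independent equation with right-hand side equal to -1 plays the role of the Pontryagin equation for the mean first-passage time.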
