Abstract

A procedure for designing feedback control to asymptotically stabilize, with probability one, quasi-integrable Hamiltonian systems with bounded uncertain parametric disturbances is proposed. First, the partially averaged Itô stochastic differential equations are derived from the given system by using the stochastic averaging method for quasi-integrable Hamiltonian systems. Second, the Hamilton–Jacobi–Isaacs (HJI) equation for the ergodic control problem of the averaged system, with a performance index whose cost function is left undetermined, is established based on the principle of optimality. This equation is then solved to yield the worst disturbances and the associated optimal controls. Third, the asymptotic Lyapunov stability with probability one of the optimally controlled system under the worst disturbances is analyzed by evaluating the maximal Lyapunov exponent of the fully averaged Itô equations. Finally, the cost function and feedback control are determined by the requirement of stabilizing the worst-disturbed system. A simple example is worked out to illustrate the application of the proposed procedure and the effects of optimal control on stabilizing the uncertain system.
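For orientation, the following is a minimal sketch of the objects named above in generic notation; the symbols $H_i$, $m_i$, $\sigma_{ik}$, $u$, $d$, $f$, $V$, and $\gamma$ are illustrative placeholders rather than the paper's exact definitions. Stochastic averaging reduces the system to Itô equations for the slowly varying Hamiltonian (action) variables,

\[
\mathrm{d}H_i = m_i(\mathbf{H}, u, d)\,\mathrm{d}t + \sigma_{ik}(\mathbf{H})\,\mathrm{d}B_k(t), \qquad i = 1, \dots, n,
\]

the ergodic performance index has the form

\[
J(u, d) = \lim_{T \to \infty} \frac{1}{T}\, E\!\left[\int_0^T f(\mathbf{H}, u)\,\mathrm{d}t\right],
\]

and the associated HJI equation is a minimax dynamic-programming equation of the type

\[
\min_{u} \max_{d} \left\{ \frac{1}{2}\,\sigma_{ik}(\mathbf{H})\,\sigma_{jk}(\mathbf{H})\,
\frac{\partial^2 V}{\partial H_i \partial H_j}
+ m_i(\mathbf{H}, u, d)\,\frac{\partial V}{\partial H_i}
+ f(\mathbf{H}, u) \right\} = \gamma,
\]

whose solution yields the worst disturbance $d^{*}$ and the associated optimal control $u^{*}$. Asymptotic stability with probability one of the optimally controlled, worst-disturbed system then corresponds to a negative maximal Lyapunov exponent $\lambda_{\max}$ of the fully averaged Itô equations.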
