A procedure is proposed for designing a feedback control that stabilizes a quasi-nonintegrable Hamiltonian system asymptotically with probability one. First, the equations of motion of the system are reduced to a one-dimensional averaged Itô stochastic differential equation for the controlled Hamiltonian by using the stochastic averaging method for quasi-nonintegrable Hamiltonian systems. Second, a dynamic programming equation for the ergodic control problem of the averaged system, with an undetermined cost function, is established from the dynamic programming principle; solving this equation yields the optimal control law. Third, a formula for the Lyapunov exponent of the completely averaged Itô equation is derived by introducing a new norm, the square root of the Hamiltonian, into the definitions of stochastic stability and the Lyapunov exponent. The asymptotic stability with probability one of the original controlled system is then analysed approximately by means of this Lyapunov exponent. Finally, the cost function is determined by the requirement that the system be stabilized. Two examples illustrate the application of the proposed procedure and the effectiveness of the control in stabilizing the system.
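The Lyapunov exponent in the third step can also be estimated numerically from the averaged Itô equation dH = m(H)dt + σ(H)dB(t), as λ = lim (1/t) ln √H(t). The sketch below is a minimal illustration of this idea, not the paper's method: it integrates a scalar Itô equation by the Euler–Maruyama scheme and evaluates the exponent with the √H norm. The drift and diffusion coefficients passed in the usage example (m(H) = −0.5H, σ(H) = 0.5H) are hypothetical placeholders chosen so that the exact exponent, (a − b²/2)/2 = −0.3125, is known for comparison.

```python
import numpy as np

def lyapunov_exponent(m, sigma, H0=1.0, T=100.0, dt=1e-3, seed=0):
    """Estimate lambda = lim (1/t) ln sqrt(H(t)) for dH = m(H)dt + sigma(H)dB
    by Euler-Maruyama integration of a single sample path.

    m, sigma : callables giving the drift and diffusion coefficients.
    H0       : initial value of the Hamiltonian (must be positive).
    """
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    H = H0
    for _ in range(n):
        dB = rng.normal(0.0, np.sqrt(dt))          # Brownian increment
        H = max(H + m(H) * dt + sigma(H) * dB, 1e-300)  # keep H positive
    return np.log(np.sqrt(H)) / T                  # exponent in the sqrt(H) norm

# Hypothetical linear coefficients: dH = -0.5 H dt + 0.5 H dB,
# whose exact exponent is (a - b^2/2)/2 = -0.3125 (negative => stable).
est = lyapunov_exponent(lambda H: -0.5 * H, lambda H: 0.5 * H)
```

A negative estimate indicates that H(t) → 0, i.e. the controlled system is asymptotically stable with probability one in this norm; a single long trajectory suffices here because the exponent is an almost-sure limit.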