Abstract

Finite-time stability concerns dynamical systems whose trajectories converge to an equilibrium state in finite time. Sufficient conditions for finite-time stability of discrete-time dynamical systems have recently been developed in the literature. In this article, we build on these results to develop a framework for optimal nonlinear analysis and feedback control addressing finite-time stability and finite-time stabilization of nonlinear discrete-time controlled dynamical systems. Finite-time stability of the closed-loop nonlinear system is guaranteed by means of a Lyapunov function that satisfies a difference inequality involving fractional powers and a minimum operator. This Lyapunov function is shown to be the solution of a difference equation corresponding to a steady-state form of the Bellman equation, thereby guaranteeing both finite-time stability and optimality. Finally, a numerical example is presented to demonstrate the efficacy of the proposed finite-time discrete stabilization framework.
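As a minimal illustration of the kind of difference inequality the abstract describes, the sketch below simulates a toy scalar closed-loop system whose Lyapunov function V(x) = |x| decreases at each step by min(c·V(x)^θ, V(x)), which forces the trajectory to reach the origin in finitely many steps. The update rule, the constants `c` and `theta`, and all function names here are illustrative assumptions for this sketch, not the paper's construction.

```python
def step(x, c=0.5, theta=0.5):
    """One step of an illustrative finite-time-stable scalar update.

    With V(x) = |x|, this update satisfies the difference inequality
        V(x_next) - V(x) <= -min(c * V(x)**theta, V(x)),
    for c in (0, 1] and theta in (0, 1); the min operator caps the
    decrement so the state never overshoots the origin.
    """
    if x == 0:
        return 0.0
    decrement = min(c * abs(x) ** theta, abs(x))
    sign = 1.0 if x > 0 else -1.0
    return sign * (abs(x) - decrement)

def settle_steps(x0, max_iter=1000):
    """Count steps until the trajectory reaches the origin exactly."""
    x, k = x0, 0
    while x != 0 and k < max_iter:
        x = step(x)
        k += 1
    return k

# The fractional power theta < 1 makes the decrement bounded away from
# zero near the origin, so convergence occurs in finite time rather than
# merely asymptotically.
print(settle_steps(1.0))
```

Note that with an exponent of one (a standard linear contraction), the same trajectory would only converge asymptotically; the fractional power is what yields a finite settling time.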
