Abstract

Finite-time stability concerns dynamical systems whose trajectories converge to an equilibrium state in finite time. Sufficient conditions for finite-time stability of discrete-time dynamical systems have recently been developed in the literature. In this article, we build on these results to develop a framework for optimal nonlinear analysis and feedback control addressing finite-time stability and finite-time stabilization of nonlinear discrete-time controlled dynamical systems. Finite-time stability of the closed-loop nonlinear system is guaranteed by means of a Lyapunov function that satisfies a difference inequality involving fractional powers and a minimum operator. This Lyapunov function is shown to be the solution of a difference equation corresponding to a steady-state form of the Bellman equation, thereby guaranteeing both finite-time stability and optimality. Finally, a numerical example is presented to demonstrate the efficacy of the proposed finite-time discrete stabilization framework.
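To make the Lyapunov condition concrete, here is a minimal, hypothetical scalar sketch (not the paper's example): with `V(x) = |x|`, the update `x_{k+1} = x_k - sign(x_k) * min(|x_k|, c*|x_k|**theta)` yields the difference inequality `V(x_{k+1}) <= V(x_k) - min(c*V(x_k)**theta, V(x_k))` with fractional power `theta` in (0, 1) and a minimum operator, which forces the state to reach the origin in finitely many steps. The constants `c` and `theta` below are illustrative choices, not taken from the article.

```python
import math

def step(x, c=0.5, theta=0.5):
    # One step of a hypothetical finite-time stabilizing update:
    #   x_{k+1} = x_k - sign(x_k) * min(|x_k|, c * |x_k|**theta)
    # With V(x) = |x| this satisfies the difference inequality
    #   V(x_{k+1}) <= V(x_k) - min(c * V(x_k)**theta, V(x_k)),
    # so V decreases by at least a fixed amount away from the origin
    # and is driven exactly to zero once |x| <= c * |x|**theta.
    if x == 0.0:
        return 0.0
    decrement = min(abs(x), c * abs(x) ** theta)
    return x - math.copysign(decrement, x)

def settle(x0, c=0.5, theta=0.5, max_iter=1000):
    """Iterate until the state reaches the origin; return (steps, trajectory)."""
    x, traj = x0, [x0]
    for k in range(max_iter):
        if x == 0.0:
            return k, traj
        x = step(x, c, theta)
        traj.append(x)
    return max_iter, traj
```

Unlike an exponentially convergent (merely asymptotically stable) update such as `x_{k+1} = 0.9 * x_k`, which never exactly reaches zero, this trajectory hits the origin after a finite number of steps: once `|x_k|` falls below `c**(1/(1-theta))`, the minimum selects `|x_k|` itself and the next state is exactly zero.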
