Abstract

This paper investigates asymptotic stability in probability and stabilization design for discrete-time stochastic systems with state-dependent noise perturbations. Our work begins with a lemma on a special discrete-time stochastic system for which almost all sample paths starting from a nonzero initial value never subsequently reach the origin. This motivates us to study the asymptotic stability in probability of discrete-time stochastic systems. A stochastic Lyapunov theorem on asymptotic stability in probability is proved by means of the supermartingale convergence theorem. An example is given to illustrate the difference between asymptotic stability in probability and almost sure asymptotic stability. Based on the stochastic Lyapunov theorem, the problem of asymptotic stabilization for discrete-time stochastic control systems is considered. Sufficient conditions are proposed and applied to construct asymptotically stabilizing feedback controllers. Copyright © 2014 John Wiley & Sons, Ltd.
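The abstract does not spell out the system model or the Lyapunov condition; the following LaTeX sketch shows a typical formulation consistent with the summary, where the system form, the function V, and the class-K bound alpha are assumptions made here for illustration rather than the authors' stated hypotheses.

% Assumed model (not stated in the abstract): a discrete-time stochastic
% system with state-dependent (multiplicative) noise.
\[
  x_{k+1} = f(x_k) + g(x_k)\, w_k , \qquad k = 0, 1, 2, \ldots,
\]
% where \{w_k\} is an i.i.d. zero-mean noise sequence and f(0) = 0, g(0) = 0.
% A Lyapunov-type condition of the kind typically used for asymptotic
% stability in probability: there exist a positive definite function V and a
% class-\mathcal{K} function \alpha such that
\[
  \mathbb{E}\bigl[\, V(x_{k+1}) \mid x_k \,\bigr] - V(x_k) \;\le\; -\alpha\bigl(\lVert x_k \rVert\bigr).
\]
% Under such a condition, V(x_k) is a nonnegative supermartingale along the
% system trajectories, so the supermartingale convergence theorem gives
% almost sure convergence of V(x_k); from this, convergence of x_k to the
% origin in the probabilistic sense can be deduced.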

Full Text
Paper version not known
