Abstract

A stability analysis of stochastic systems which may be Markovianized under feedback is presented. Two different formulations of stability are introduced, along with necessary and sufficient conditions for each. If the closed-loop system satisfies a condition known as local stochastic controllability, these notions of stability are shown to be equivalent. Under the local stochastic controllability assumption a variety of results are presented, showing that such systems exhibit very regular asymptotic behavior. For example, for stable locally stochastically controllable systems, averages of functions of the state process converge for every initial condition. Furthermore, if there is exactly one invariant probability, then the probabilities governing the state process converge to a periodic orbit consisting of weighted averages of restrictions of the invariant probability to cyclical sets. The results are applied to the stability analysis of a random parameter system operating under feedback.
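As an informal illustration of the averaging claim (not taken from the paper itself): consider a hypothetical scalar random parameter system x_{k+1} = a_k x_k + u_k + w_k placed under stabilizing linear feedback u_k = -K x_k, so that the closed loop is a Markov chain. Under conditions of the kind described in the abstract, time averages of a bounded function of the state should converge to the same limit from every initial condition. A minimal simulation sketch in Python, with all dynamics, feedback gain, and noise distributions chosen purely for illustration:

```python
import numpy as np

def simulate_time_average(x0, n_steps=200_000, seed=0):
    """Simulate x_{k+1} = a_k * x_k + u_k + w_k under feedback u_k = -0.5 * x_k,
    with a_k ~ Uniform(0.2, 0.8) and w_k ~ N(0, 1) (illustrative choices only).
    Returns the time average of the bounded state function f(x) = min(x^2, 10)."""
    rng = np.random.default_rng(seed)
    x = x0
    total = 0.0
    for _ in range(n_steps):
        a = rng.uniform(0.2, 0.8)   # random parameter at time k
        w = rng.normal()            # process noise
        u = -0.5 * x                # stabilizing linear feedback
        x = a * x + u + w           # closed-loop (Markovian) state update
        total += min(x * x, 10.0)   # bounded function of the state
    return total / n_steps

# Time averages started from very different initial conditions agree,
# as the ergodic behavior described in the abstract would predict.
for x0 in (-50.0, 0.0, 50.0):
    print(x0, simulate_time_average(x0))
```

Here the closed-loop multiplier a_k - 0.5 lies in (-0.3, 0.3), so the example is comfortably stable; the printed averages should agree to within Monte Carlo error regardless of x0.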
