Abstract

This chapter provides a compact introduction to discrete-time and continuous-time Markov chains. It explains the key concepts: the definitions of discrete-time and continuous-time Markov chains, the Chapman-Kolmogorov equations, reachability, communication and communication classes, recurrent and transient states, the period of a state in a discrete-time Markov chain, the limiting probabilities of states in discrete-time and continuous-time Markov chains, and ergodic Markov chains. Several examples and problems are solved for discrete-time Markov chains, and where relevant, state transition diagrams and tables are used to aid comprehension of the solutions.
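As a brief illustrative sketch (not an example from the chapter itself), the following uses a hypothetical two-state discrete-time Markov chain to demonstrate two of the concepts listed above: the Chapman-Kolmogorov relation P^(m+n) = P^m P^n, and the convergence of the rows of P^n to the limiting probabilities for an ergodic chain. The transition matrix is invented for illustration.

```python
# Hypothetical two-state chain (states 0 and 1); probabilities chosen for illustration.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, n):
    """n-step transition matrix A^n (n >= 1), by repeated multiplication."""
    result = A
    for _ in range(n - 1):
        result = mat_mul(result, A)
    return result

# Chapman-Kolmogorov: the 5-step matrix equals the product of the
# 2-step and 3-step matrices.
P5 = mat_pow(P, 5)
P2_P3 = mat_mul(mat_pow(P, 2), mat_pow(P, 3))

# Ergodicity: for this chain every row of P^n approaches the same
# limiting distribution as n grows (here pi = (4/7, 3/7)).
P50 = mat_pow(P, 50)
```

Running the sketch, `P5` and `P2_P3` agree entry by entry, and both rows of `P50` are numerically indistinguishable from the limiting distribution obtained by solving pi P = pi.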
