Abstract

Markov chains with a countably infinite state space exhibit some types of behavior not possible for chains with a finite state space. Figure 5.1 helps explain how these new types of behavior arise. If p > 1/2, then transitions to the right occur with higher frequency than transitions to the left, so, reasoning heuristically, we expect X_n to be large for large n. This means that, given X_0 = 0, the probability P^n_{0j} should go to zero for any fixed j as n increases. If one tried to define the steady-state probability of state j as lim_{n→∞} P^n_{0j}, then this limit would be 0 for all j. These probabilities would not sum to 1, and thus would not correspond to a limiting distribution, so we say that a steady state does not exist. In more heuristic terms, the state keeps increasing forever. The truncation of Figure 5.1 to k states is analyzed in Exercise 4.3. The solution there defines ρ = p/q and shows that π_i = (1−ρ)ρ^i/(1−ρ^k) for ρ ≠ 1 and π_i = 1/k for ρ = 1. The limiting behavior as k → ∞ is π_i = (1−ρ)ρ^i for ρ < 1 and π_i = 0 otherwise. In Section 5.3 we analyze birth-death Markov chains, of which Figure 5.1 is an example, without first truncating the chain.
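The truncated-chain result can be checked numerically. The sketch below, which assumes the standard truncation of a birth-death chain (up-probability p, down-probability q = 1 − p, with the leftover probability kept as a self-loop at each boundary state), builds the k-state transition matrix, computes its stationary distribution as the left eigenvector for eigenvalue 1, and compares it to the closed form π_i = (1−ρ)ρ^i/(1−ρ^k):

```python
import numpy as np

def truncated_chain(p, k):
    """Transition matrix of the birth-death chain of Figure 5.1 truncated
    to states 0, 1, ..., k-1 (boundary details are an assumption here)."""
    q = 1.0 - p
    P = np.zeros((k, k))
    for i in range(k):
        if i + 1 < k:
            P[i, i + 1] = p      # step right with probability p
        else:
            P[i, i] += p         # self-loop at the right boundary
        if i - 1 >= 0:
            P[i, i - 1] = q      # step left with probability q
        else:
            P[i, i] += q         # self-loop at state 0
    return P

def stationary(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

p, k = 0.4, 10
rho = p / (1.0 - p)
pi_numeric = stationary(truncated_chain(p, k))
pi_formula = (1 - rho) * rho ** np.arange(k) / (1 - rho ** k)
print(np.max(np.abs(pi_numeric - pi_formula)))  # close to 0
```

With ρ < 1, increasing k shows π_i approaching the geometric limit (1−ρ)ρ^i, consistent with the limiting behavior stated above.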
