Abstract

We are interested in the asymptotic behavior of Markov chains on the set of positive integers for which, loosely speaking, large jumps are rare and occur at a rate that behaves like a negative power of the current state, and such that small positive and negative steps of the chain roughly compensate each other. If $X_n$ is such a Markov chain started at $n$, we establish a limit theorem for $\frac{1}{n}X_n$ appropriately scaled in time, where the scaling limit is given by a nonnegative self-similar Markov process. We also study the asymptotic behavior of the time needed by $X_n$ to reach some fixed finite set. We identify three different regimes (roughly speaking the transient, the recurrent and the positive-recurrent regimes) in which $X_n$ exhibits different behavior. The present results extend those of Haas and Miermont [Bernoulli 17 (2011) 1217–1247], who focused on the case of nonincreasing Markov chains. We further present a number of applications to the study of Markov chains with asymptotically zero drifts such as Bessel-type random walks, nonnegative self-similar Markov processes, invariance principles for random walks conditioned to stay positive, and exchangeable coalescence-fragmentation processes.
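
As an illustration only (not the paper's exact statement), a limit theorem of the kind announced above takes the schematic form
\[
\left(\frac{1}{n}\,X_n\bigl(\lfloor n^{\gamma} t\rfloor\bigr)\right)_{t\ge 0} \;\xrightarrow[n\to\infty]{(d)}\; (Y_t)_{t\ge 0},
\]
where $X_n(k)$ denotes the position of the chain started at $n$ after $k$ steps, $\gamma>0$ is a time-scaling exponent tied to the power at which large jumps occur (notation introduced here for illustration), and $(Y_t)_{t\ge 0}$ is a nonnegative self-similar Markov process; the precise assumptions and the identification of $\gamma$ and of the limit process are made in the body of the paper.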
