Abstract

The classic arcsine law for the number $N_n^{>} := \sum_{k=1}^{n} \mathbf{1}_{\{S_k>0\}}$ of positive terms, as $n\to\infty$, in an ordinary random walk $(S_n)_{n\ge 0}$ is extended to the case when this random walk is governed by a positive recurrent Markov chain $(M_n)_{n\ge 0}$ on a countable state space $\mathcal{S}$, that is, to a Markov random walk $(M_n,S_n)_{n\ge 0}$ with positive recurrent discrete driving chain. More precisely, it is shown that $n^{-1}N_n^{>}$ converges in distribution to a generalized arcsine law with parameter $\rho\in[0,1]$ (the classic arcsine law if $\rho=1/2$) iff Spitzer's condition
\[
\lim_{n\to\infty}\frac{1}{n}\sum_{k=1}^{n} P_i(S_k>0) \;=\; \rho
\]
holds true for some, and then all, $i\in\mathcal{S}$, where $P_i := P(\cdot \mid M_0=i)$ for $i\in\mathcal{S}$. It is also proved, under an extra assumption on the driving chain if $0<\rho<1$, that this condition is equivalent to the stronger variant $\lim_{n\to\infty} P_i(S_n>0)=\rho$. For an ordinary random walk, this was shown by Doney (1995) for $0<\rho<1$ and by Bertoin and Doney (1997) for $\rho\in\{0,1\}$.
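
For context, a standard fact not spelled out in the abstract (stated here under the usual convention for the generalized arcsine, or Lamperti, law): for $0<\rho<1$ the limit law above is the Beta$(\rho,1-\rho)$ distribution on $(0,1)$ with density
\[
f_\rho(x) \;=\; \frac{\sin(\pi\rho)}{\pi}\, x^{\rho-1}(1-x)^{-\rho}, \qquad 0<x<1,
\]
which reduces to the classic arcsine density $\frac{1}{\pi\sqrt{x(1-x)}}$ at $\rho=\tfrac12$, while for $\rho\in\{0,1\}$ the limit of $n^{-1}N_n^{>}$ degenerates to the point mass at $0$ or $1$, respectively.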
