Let $(X,d)$ be a complete separable metric space and $(F_n)_{n\ge 0}$ a sequence of i.i.d. random functions from $X$ to $X$ which are uniformly Lipschitz, that is, $L_n=\sup_{x\ne y} d(F_n(x),F_n(y))/d(x,y)<\infty$ a.s. Under the mean contraction assumption $E\log L_1<0$ and the condition $E\log^+ d(F_1(x_0),x_0)<\infty$ for some $x_0\in X$, it was proved by Elton (Stochast. Proc. Appl. 34 (1990) 39–47) that the forward iterations $M_n^x=F_n\circ\cdots\circ F_1(x)$, $n\ge 0$, converge weakly to a unique stationary distribution $\pi$ for each $x\in X$. The associated backward iterations $\widehat{M}_n^x=F_1\circ\cdots\circ F_n(x)$ converge a.s. to a random variable $\widehat{M}_\infty$ which does not depend on $x$ and has distribution $\pi$. Based on the inequality $d(\widehat{M}_{n+m}^x,\widehat{M}_n^x)\le \exp\bigl(\sum_{k=1}^n \log L_k\bigr)\,d(F_{n+1}\circ\cdots\circ F_{n+m}(x),x)$ for all $n,m\ge 0$ and the observation that $\bigl(\sum_{k=1}^n \log L_k\bigr)_{n\ge 0}$ forms an ordinary random walk with negative drift, we will provide new estimates for $d(\widehat{M}_\infty,\widehat{M}_n^x)$ and $d(M_n^x,M_n^y)$, $x,y\in X$, under polynomial as well as exponential moment conditions on $\log(1+L_1)$ and $\log(1+d(F_1(x_0),x_0))$. In particular, it will be shown that the Prokhorov distance between $P^n(x,\cdot)$ and $\pi$ decreases to $0$ at a polynomial, respectively exponential, rate under these conditions, where $P^n$ denotes the $n$-step transition kernel of the Markov chain of forward iterations. The exponential rate was recently proved by Diaconis and Freedman (SIAM Rev. 41 (1999) 45–76) using different methods.
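To make the key step more transparent, here is a short sketch (using only the uniform Lipschitz property defined above, not a statement from the paper itself) of how the quoted inequality follows: since each $F_k$ is Lipschitz with constant $L_k$, the composition $F_1\circ\cdots\circ F_n$ is Lipschitz with constant at most $L_1\cdots L_n$, whence

```latex
% Sketch: the composition F_1 o ... o F_n is Lipschitz with constant at most L_1 * ... * L_n,
% which gives the inequality quoted in the abstract.
\begin{align*}
d\bigl(\widehat{M}_{n+m}^{\,x},\widehat{M}_n^{\,x}\bigr)
  &= d\Bigl(F_1\circ\cdots\circ F_n\bigl(F_{n+1}\circ\cdots\circ F_{n+m}(x)\bigr),\,
            F_1\circ\cdots\circ F_n(x)\Bigr)\\
  &\le \Bigl(\prod_{k=1}^{n} L_k\Bigr)\, d\bigl(F_{n+1}\circ\cdots\circ F_{n+m}(x),\,x\bigr)
   = \exp\Bigl(\sum_{k=1}^{n}\log L_k\Bigr)\, d\bigl(F_{n+1}\circ\cdots\circ F_{n+m}(x),\,x\bigr).
\end{align*}
```

Since $E\log L_1<0$, the exponent $\sum_{k=1}^n \log L_k$ is a random walk with negative drift and tends to $-\infty$ a.s.; together with the moment condition on $d(F_1(x_0),x_0)$, this is what drives the a.s. convergence of the backward iterations to $\widehat{M}_\infty$.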