Abstract

Let $\{x_j\}$ and $\{y_j\}$ be jointly stationary and ergodic sequences of random variables. Denote the information density of $\{x_j\}_1^n$ and $\{y_j\}_1^n$, and its expectation, the information, by $i(X_0^n, Y_0^n)$ and $I(X_0^n, Y_0^n)$, respectively. Denote the conditional information of $\{x_j, y_j\}_1^l$ and $\{x_j, y_j\}_{m+1}^n$ given $\{x_j, y_j\}_{l+1}^m$ by $I(X_0^l Y_0^l, X_m^n Y_m^n \mid X_l^m Y_l^m)$. We prove that if, for every $l > 0$, $\lim_{m,n \to \infty} I(X_0^l Y_0^l, X_m^n Y_m^n \mid X_l^m Y_l^m) = 0$, then $\bar I \equiv \lim_{n \to \infty} I(X_0^n, Y_0^n)/n < \infty$ and $\lim_{n \to \infty} i(X_0^n, Y_0^n)/n = \bar I$; namely, $\{x_j\}$ and $\{y_j\}$ have a finite information rate and are information stable. Furthermore, we extend the result to continuous-parameter stochastic processes.
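For orientation, these quantities admit the following standard formulation (a sketch using common conventions; the measures $P$ and expectation $E$ below are assumed, not quoted from the paper). The information density is the logarithm of the Radon–Nikodym derivative of the joint distribution with respect to the product of the marginals, and the information is its expectation:
$$ i(X_0^n, Y_0^n) = \log \frac{dP_{X_0^n Y_0^n}}{d\left(P_{X_0^n} \times P_{Y_0^n}\right)}, \qquad I(X_0^n, Y_0^n) = E\left[\, i(X_0^n, Y_0^n) \,\right]. $$
Information stability then asserts that the normalized density $i(X_0^n, Y_0^n)/n$ converges (in probability, in the usual sense going back to Pinsker) to the information rate $\bar I$.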
