Abstract

We consider hidden Markov chains obtained by passing a Markov chain with rare transitions through a noisy memoryless channel, and we obtain asymptotic estimates for the entropy of the resulting hidden Markov chain as the transition rate is reduced to zero.

Let (X_n) be a Markov chain with finite state space S and transition matrix P(p), and let (Y_n) be the hidden Markov chain observed by passing (X_n) through a homogeneous noisy memoryless channel: Y takes values in a set T, and there exists a matrix Q such that P(Y_n = j | X_n = i, X_{-∞}^{n-1}, X_{n+1}^{∞}, Y_{-∞}^{n-1}, Y_{n+1}^{∞}) = Q_{ij}. We make the additional assumption on the channel that the rows of Q are distinct; in this case we call the channel statistically distinguishing.

We assume that P(p) is of the form I + pA, where A is a matrix with negative entries on the diagonal, non-negative off-diagonal entries, and zero row sums. We further assume that for small positive p, the Markov chain with transition matrix P(p) is irreducible. Notice that for Markov chains of this form, the invariant distribution (π_i)_{i∈S} does not depend on p. In this case, we say that for small positive values of p the Markov chain is in a rare-transition regime.

We adopt the convention that H is used to denote the entropy of a finite partition, whereas h is used to denote the entropy of a process (the entropy rate in information-theory terminology). Given an irreducible Markov chain with transition matrix P, we let h(P) be the entropy of the Markov chain, i.e. h(P) = −Σ_{i,j} π_i P_{ij} log P_{ij}, where (π_i) is the (unique) invariant distribution of the Markov chain and, as usual, we adopt the convention that 0 log 0 = 0. We also let H_chan(i) be the entropy of the output of the channel when the input symbol is i, i.e. H_chan(i) = −Σ_{j∈T} Q_{ij} log Q_{ij}. Let h(Y) denote the entropy of the process (Y_n).
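As a minimal numeric sketch of the rare-transition setup (the generator A below is a hypothetical two-state example of mine, not taken from the paper), the following computes the invariant distribution and the entropy rate h(P(p)) = −Σ_{i,j} π_i P_{ij} log P_{ij} for P(p) = I + pA, illustrating that π does not depend on p:

```python
import numpy as np

def stationary(P):
    # Invariant distribution: left eigenvector of P for eigenvalue 1,
    # normalized to sum to 1 (assumes P is irreducible).
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return v / v.sum()

def h(P):
    # Entropy rate h(P) = -sum_{i,j} pi_i P_ij log P_ij, with 0 log 0 = 0.
    pi = stationary(P)
    logP = np.where(P > 0, np.log(P), 0.0)
    return -np.sum(pi[:, None] * P * logP)

# Hypothetical generator A: negative diagonal, non-negative off-diagonal
# entries, zero row sums -- the form assumed in the abstract.
A = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])

for p in (0.1, 0.01):
    P = np.eye(2) + p * A  # P(p) = I + pA
    print(np.round(stationary(P), 6), h(P))
```

For this A, the invariant distribution is (2/3, 1/3) for every small p > 0, since πP(p) = π reduces to πA = 0, which is independent of p; meanwhile h(P(p)) shrinks toward 0 as p → 0, which is the rare-transition regime.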

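The per-symbol channel entropy H_chan(i) = −Σ_{j∈T} Q_{ij} log Q_{ij} and the statistically-distinguishing condition (distinct rows of Q) can likewise be sketched; the binary symmetric channel below is my own illustrative choice, not an example from the paper:

```python
import numpy as np

def H_chan(Q, i):
    # Entropy of the channel output given input symbol i:
    # H_chan(i) = -sum_{j in T} Q_ij log Q_ij, with 0 log 0 = 0.
    row = Q[i]
    nz = row[row > 0]
    return -np.sum(nz * np.log(nz))

def statistically_distinguishing(Q):
    # The channel is statistically distinguishing iff the rows of Q
    # are pairwise distinct.
    rows = [tuple(r) for r in Q]
    return len(set(rows)) == len(rows)

# Hypothetical binary symmetric channel with crossover probability 0.1.
Q = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(H_chan(Q, 0), statistically_distinguishing(Q))
```

A noiseless channel (Q = I) has H_chan(i) = 0 for every i, while a channel whose rows coincide, e.g. Q_{ij} = 1/|T|, fails to be statistically distinguishing.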