Abstract

This paper studies the entropy and filtering of hidden Markov processes (HMPs) obtained by observing a binary homogeneous Markov chain through an arbitrary memoryless channel. A fixed-point functional equation is derived for the stationary distribution of an input symbol conditioned on all past observations. The entropy or differential entropy rate of the HMP can then be computed in two ways: by averaging the entropy of each input symbol conditioned on past observations, or by applying a differential relationship between the input-output mutual information and the stationary distribution obtained via filtering. The existence of a solution to the fixed-point equation is guaranteed by martingale theory, and its uniqueness follows from the fact that the solution is the fixed point of a contraction mapping. Since the fixed-point equation lacks an analytical solution, a numerical method is proposed in which the equation is first converted to a discrete linear system using uniform quantization and then solved using quadratic programming. Two examples are presented, in which the numerical method is applied to the binary symmetric channel (BSC) and the additive white Gaussian noise (AWGN) channel. Unlike many other numerical methods, this solution does not rely on averaging over a long sample path of the HMP.
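The pipeline the abstract describes (quantize the conditional belief, build a discrete linear system for its stationary distribution, then average the per-symbol conditional entropies) can be illustrated for the BSC example. The sketch below is ours, not the paper's code: it assumes a binary chain with transition probabilities `p01 = P(X_{n+1}=1 | X_n=0)` and `p11 = P(X_{n+1}=1 | X_n=1)` observed through a BSC with crossover `eps`, and it substitutes a simple least-squares solve for the paper's quadratic program; all function and variable names are hypothetical.

```python
import numpy as np

def entropy_rate_bsc(p01, p11, eps, n_bins=500):
    """Approximate the entropy rate of a binary Markov chain observed
    through a BSC(eps). The predictive belief pi = P(X_n=1 | Y^{n-1})
    is quantized into n_bins uniform cells; the induced transition
    matrix of the filter is assembled; its stationary distribution is
    found from the discrete linear system (here via least squares, as
    a stand-in for the paper's quadratic program); the entropy rate is
    the stationary average of H(Y_n | Y^{n-1})."""
    centers = (np.arange(n_bins) + 0.5) / n_bins   # bin centers in (0, 1)
    T = np.zeros((n_bins, n_bins))
    for i, pi in enumerate(centers):
        for y in (0, 1):
            # likelihoods of observing y given X_n = 1 and X_n = 0
            l1 = 1 - eps if y == 1 else eps
            l0 = eps if y == 1 else 1 - eps
            py = pi * l1 + (1 - pi) * l0            # P(Y_n = y | Y^{n-1})
            post = pi * l1 / py                     # P(X_n = 1 | Y^n), Bayes update
            nxt = post * p11 + (1 - post) * p01     # next predictive belief
            j = min(int(nxt * n_bins), n_bins - 1)  # quantize to a bin
            T[i, j] += py
    # stationary distribution: mu T = mu with sum(mu) = 1
    A = np.vstack([T.T - np.eye(n_bins), np.ones(n_bins)])
    b = np.zeros(n_bins + 1)
    b[-1] = 1.0
    mu, *_ = np.linalg.lstsq(A, b, rcond=None)
    mu = np.clip(mu, 0, None)
    mu /= mu.sum()
    # entropy rate = E_mu[ H_b( P(Y_n = 1 | belief) ) ] in bits
    p1 = centers * (1 - eps) + (1 - centers) * eps
    hb = -p1 * np.log2(p1) - (1 - p1) * np.log2(1 - p1)
    return float(mu @ hb)
```

Two sanity checks follow from the model itself: with `eps = 0.5` the output is pure coin flips, so the rate is exactly 1 bit/symbol, and with `eps = 0` it reduces (up to quantization error) to the entropy rate of the underlying Markov chain.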

