Abstract
We study new formulae based on Lyapunov exponents for the entropy, mutual information, and capacity of finite-state, discrete-time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. We show that the entropy rate for a symbol sequence is equal to the primary Lyapunov exponent for a product of random matrices. We then develop a continuous state space Markov chain formulation that allows us to compute entropy rates directly as expectations with respect to the Markov chain's stationary distribution. We also show that the stationary distribution is a continuous function of the input symbol dynamics; this continuity allows the channel capacity to be written in terms of Lyapunov exponents.
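As a sketch of the central relation (the matrices A_i and the norm below are illustrative notation assumed here, not taken from the paper's body): for a stationary ergodic sequence of random d-by-d matrices A_1, A_2, ..., the Furstenberg-Kesten theorem defines the top (primary) Lyapunov exponent as

    \lambda \;=\; \lim_{n \to \infty} \frac{1}{n}\, \mathbb{E}\!\left[ \log \left\| A_n A_{n-1} \cdots A_1 \right\| \right].

The abstract's first result then asserts that, for a suitable channel-dependent construction of the random matrices A_i (specified in the paper itself), the entropy rate of the symbol sequence equals this exponent, up to the paper's sign convention.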