Abstract
Some of the basic concepts of information theory are critically reviewed in the light of a generalized formulation of the theory of Markov chains, in which the initial and final states are sequences of symbols of different lengths, and the occurrence of symbols is governed by inter-symbol correlation probabilities of finite range. In particular, the conditions of ergodicity and the structure of ergodic subsets of sequences of arbitrary length are carefully discussed. A mathematical method is developed to determine the range and strength of inter-symbol correlation. A brief summary of the content is given at the end of Section 1.
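The ergodicity condition discussed in the abstract can be illustrated numerically: for a finite-range (here first-order) Markov symbol source whose transition matrix is irreducible and aperiodic, the rows of successive powers of the matrix all converge to a single stationary distribution. The following is a minimal sketch, not taken from the paper; the symbol set and transition probabilities are invented for illustration.

```python
# Illustrative sketch (hypothetical example, not from the paper):
# a first-order Markov symbol source over three symbols, with an
# ergodicity check via convergence of powers of the transition matrix.

SYMBOLS = ["A", "B", "C"]

# P[i][j] = probability that SYMBOLS[j] follows SYMBOLS[i].
# All entries are positive, so the chain is irreducible and aperiodic,
# hence ergodic.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
]

def mat_mul(X, Y):
    """Multiply two square matrices given as lists of row lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Repeated squaring: after 30 squarings Q = P^(2**30), far past mixing.
Q = P
for _ in range(30):
    Q = mat_mul(Q, Q)

# For an ergodic chain every row of Q is (numerically) the same
# stationary distribution, independent of the initial symbol.
pi = Q[0]
```

For a non-ergodic chain (for example, one with two closed subsets of symbols that never communicate) the rows would instead converge to different limits depending on the initial state, which is precisely the structural question of ergodic subsets the abstract refers to.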
Published in: Transactions of the IRE Professional Group on Information Theory