Abstract

The problem of multivariate information analysis is considered. First, interaction information in each dimension is defined analogously to McGill [4] and is then applied to Markov chains. The vanishing of the interaction information is closely related to a certain class of weakly dependent random variables. For homogeneous, recurrent Markov chains with m states, m ≥ n ≥ 3, the zero criterion for the n-dimensional interaction information is satisfied only by (n − 2)-dependent Markov chains, which are generated by certain nilpotent matrices. Furthermore, for Gaussian Markov chains, this yields a decomposition rule for the variables into mutually correlated subchains.
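
As a point of reference, a minimal sketch of the three-variable case in McGill's sign convention (assumed here, since the abstract does not reproduce the paper's n-dimensional definition):

\[
  I(X;Y;Z) \;=\; I(X;Y \mid Z) - I(X;Y)
           \;=\; H(X,Y) + H(X,Z) + H(Y,Z) - H(X) - H(Y) - H(Z) - H(X,Y,Z).
\]

In particular, I(X;Y;Z) = 0 exactly when I(X;Y | Z) = I(X;Y), i.e. when conditioning on the third variable leaves the pairwise mutual information unchanged; the zero criterion itself does not depend on the choice of sign convention.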
