Abstract

Introduced by Shannon as a "rate of actual transmission," the mutual information rate (MIR) extends mutual information to a pair of dynamical processes. We prove a delay-independence theorem, according to which MIR is insensitive to a time shift between the two processes. Numerical studies of several benchmark situations confirm that this asymptotic theoretical property remains valid for realistic finite sequences. Estimators based on block entropies and on a causal state machine algorithm outperform an estimator based on a Lempel-Ziv compression algorithm, provided that the block length and the maximum history length, respectively, can be chosen larger than the delay. MIR is thus a relevant index for measuring nonlinear correlations between two experimental or simulated sequences when the transmission delay (in input-output devices) or dephasing (in coupled systems) is variable or unknown.
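The block-entropy estimator mentioned above can be illustrated with a minimal sketch. It is not the paper's implementation, only the standard construction it relies on: for symbol sequences X and Y, compute the length-n block entropies H_n(X), H_n(Y), and H_n(X,Y), form the block mutual information I_n = H_n(X) + H_n(Y) - H_n(X,Y), and take I_n / n as a finite-n estimate of the MIR. All function names here are illustrative.

```python
from collections import Counter
from math import log2

def block_entropy(seq, n):
    """Shannon entropy (in bits) of the empirical distribution
    of length-n blocks in a symbol sequence."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def mir_estimate(x, y, n):
    """Finite-n MIR estimate from block entropies:
    I_n = H_n(X) + H_n(Y) - H_n(X,Y), then MIR ~ I_n / n."""
    joint = list(zip(x, y))  # joint process over the product alphabet
    i_n = block_entropy(x, n) + block_entropy(y, n) - block_entropy(joint, n)
    return i_n / n
```

Consistent with the delay sensitivity discussed in the abstract, applying `mir_estimate` to a sequence and a time-shifted copy of itself only recovers the shared information once the block length n exceeds the shift; for shorter blocks, the joint block distribution cannot capture the delayed dependence.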
