Abstract

The amount of information exchanged per unit of time between two nodes in a dynamical network, or between two data sets, is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems. Moreover, the definition of mutual information is based on probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks or between two time series (which are not fully deterministic), and to calculate its upper and lower bounds, without having to calculate probabilities but rather in terms of well-known and well-defined quantities in dynamical systems. As possible applications of our bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
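As a concrete illustration of the quantities involved, the following minimal Python sketch estimates the mutual information between two time series by the conventional, probability-based route that this work seeks to avoid: bin the data, estimate probabilities from a joint histogram, and sum. The coupled logistic maps, the coupling strength sigma and the bin count are illustrative assumptions on my part, not the specific system or parameters studied in the paper.

    import numpy as np

    def mutual_information(x, y, bins=16):
        # Estimate I(X;Y) in bits from the joint histogram of the two series.
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginals p(x), p(y)
        nz = pxy > 0                                # skip empty cells
        return float(np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])))

    # Two symmetrically coupled logistic maps as a stand-in two-node network.
    f = lambda u: 4.0 * u * (1.0 - u)   # fully chaotic logistic map
    n, sigma = 100_000, 0.1             # sigma: coupling strength (assumed)
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = 0.3, 0.6
    for t in range(n - 1):
        x[t + 1] = (1.0 - sigma) * f(x[t]) + sigma * f(y[t])
        y[t + 1] = (1.0 - sigma) * f(y[t]) + sigma * f(x[t])

    print(f"MI(X;Y) ≈ {mutual_information(x, y):.3f} bits")

The MIR is then the amount of such mutual information produced per unit of time; the bounds proposed in this work avoid the probability-estimation step entirely.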

Highlights

  • Shannon’s entropy quantifies information [1].

  • In previous works [14,15], we proposed an upper bound for the mutual information rate (MIR) in terms of the positive Lyapunov exponents of the synchronisation manifold.

  • A main result of this work (see Methods) is to show that, in dynamical networks or data sets with a fast decay of correlation, I_S in Eq. (1) represents the amount of mutual information between X and Y produced within a special time interval T, where T is the time for the dynamical network to lose the memory of its initial state, or for the correlation to decay to zero (a numerical sketch of T and of the Lyapunov-exponent bound follows this list).
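Both quantities mentioned in the highlights can be computed for a concrete model. Below is a minimal sketch, continuing with the coupled logistic maps assumed above, that (i) estimates the decorrelation time T as the first lag at which the autocorrelation drops below 1/e (this criterion is my choice, not the paper's) and (ii) computes the two Lyapunov exponents with the standard Benettin/QR method. The difference of two positive exponents is the form such an upper bound can take; consult Refs. [14,15] for the exact expression.

    import numpy as np

    def iterate_maps(n, sigma=0.1, x0=0.3, y0=0.6):
        # Same symmetrically coupled logistic maps as in the sketch above.
        f = lambda u: 4.0 * u * (1.0 - u)
        x, y = np.empty(n), np.empty(n)
        x[0], y[0] = x0, y0
        for t in range(n - 1):
            x[t + 1] = (1.0 - sigma) * f(x[t]) + sigma * f(y[t])
            y[t + 1] = (1.0 - sigma) * f(y[t]) + sigma * f(x[t])
        return x, y

    def decorrelation_time(x, max_lag=200):
        # T: first lag at which the autocorrelation falls below 1/e.
        x = x - x.mean()
        var = np.dot(x, x)
        for lag in range(1, max_lag):
            if np.dot(x[:-lag], x[lag:]) / var < 1.0 / np.e:
                return lag
        return max_lag

    def lyapunov_exponents(n=100_000, sigma=0.1, x0=0.3, y0=0.6):
        # Benettin/QR method: average the log-stretching along orthogonal
        # directions carried along the trajectory by the tangent map.
        f  = lambda u: 4.0 * u * (1.0 - u)
        fp = lambda u: 4.0 - 8.0 * u    # f'(u)
        x, y = x0, y0
        Q, sums = np.eye(2), np.zeros(2)
        for _ in range(n):
            J = np.array([[(1.0 - sigma) * fp(x), sigma * fp(y)],
                          [sigma * fp(x), (1.0 - sigma) * fp(y)]])
            x, y = (1.0 - sigma) * f(x) + sigma * f(y), \
                   (1.0 - sigma) * f(y) + sigma * f(x)
            Q, R = np.linalg.qr(J @ Q)
            sums += np.log2(np.abs(np.diag(R)))
        return np.sort(sums / n)[::-1]  # bits per iteration, descending

    x, y = iterate_maps(100_000)
    T = decorrelation_time(x)
    lam1, lam2 = lyapunov_exponents()
    print(f"T ≈ {T} iterations; exponents ≈ {lam1:.3f}, {lam2:.3f} bits/iter")
    # With I_S estimated as in the earlier sketch, MIR ≈ I_S / T; when both
    # exponents are positive, lam1 - lam2 plays the role of the upper bound
    # (assumed form; see Refs. [14,15] for the rigorous statement).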



Introduction

Shannon’s entropy quantifies information [1]. It measures how much uncertainty an observer has about an event being produced by a random system. Mutual information (MI) is an important quantity because it quantifies both linear and non-linear interdependencies between two systems or data sets, and it measures how much information two systems exchange or two data sets share. Due to these characteristics, it has become a fundamental quantity for understanding the development and function of the brain [2,3], for characterising [4,5] and modelling [6,7,8] complex or chaotic systems, and for quantifying the information capacity of a communication system [9]. When modelling a complex system, a first step is to decide which variables are most relevant for describing its behaviour; mutual information provides a way to identify those variables [10].
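To make these definitions concrete: for discretised variables, mutual information decomposes into Shannon entropies as I(X;Y) = H(X) + H(Y) - H(X,Y). Below is a minimal sketch of this identity on binned data; the correlated Gaussian test signals, the noise level and the bin count are illustrative choices, not taken from the paper.

    import numpy as np

    def shannon_entropy(p):
        # H = -sum p log2 p, in bits; empty bins contribute nothing.
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    rng = np.random.default_rng(0)
    x = rng.normal(size=50_000)
    y = x + 0.5 * rng.normal(size=50_000)    # y shares information with x

    pxy, _, _ = np.histogram2d(x, y, bins=32)
    pxy /= pxy.sum()
    h_x  = shannon_entropy(pxy.sum(axis=1))  # marginal entropy H(X)
    h_y  = shannon_entropy(pxy.sum(axis=0))  # marginal entropy H(Y)
    h_xy = shannon_entropy(pxy.ravel())      # joint entropy H(X,Y)
    print(f"I(X;Y) = H(X) + H(Y) - H(X,Y) ≈ {h_x + h_y - h_xy:.3f} bits")

The estimate depends on the binning: finer bins inflate the entropies of continuous variables, which is one reason a probability-free route to the MIR is attractive.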

