Abstract

The continuously growing framework of information dynamics encompasses a set of tools, rooted in information theory and statistical physics, that quantify different aspects of the statistical structure of multivariate processes reflecting the temporal dynamics of complex networks. Building on the most recent developments in this field, this work designs a complete approach to dissect the information carried by the target of a network of multiple interacting systems into the new information produced by the system, the information stored in the system, and the information transferred to it from the other systems. Information storage and transfer are then further decomposed into amounts eliciting the specific contribution of assigned source systems to the target dynamics, and amounts reflecting information modification through the balance between redundant and synergistic interaction between systems. These decompositions are formulated quantifying information either as the variance or as the entropy of the investigated processes, and their exact computation is presented for the case of linear Gaussian processes. The theoretical properties of the resulting measures are first investigated in simulations of vector autoregressive processes. The measures are then applied to assess information dynamics in cardiovascular networks from the variability series of heart period, systolic arterial pressure, and respiratory activity measured in healthy subjects during supine rest, orthostatic stress, and mental stress.
Our results document the importance of combining the assessment of information storage, transfer, and modification to investigate common and complementary aspects of network dynamics; suggest that the measures derived from the decompositions are more specific to alterations in the network properties; and indicate that information transfer and information modification are better assessed through, respectively, the entropy-based and variance-based implementations of the framework.
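The abstract states that both the variance-based and entropy-based decompositions admit exact computation for linear Gaussian processes. A minimal sketch of this idea (illustrative only, not the authors' code; the AR(1) model and all variable names are assumptions) is to estimate the predictive information of a Gaussian autoregressive process in both formulations, exploiting the fact that for Gaussian variables entropy is a function of variance:

```python
# Minimal sketch (not the paper's implementation): variance- vs
# entropy-based predictive information for a Gaussian AR(1) process.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stable AR(1) process Y_n = a*Y_{n-1} + U_n, with U ~ N(0,1)
a, n = 0.8, 100_000
y = np.zeros(n)
for t in range(1, n):
    y[t] = a * y[t - 1] + rng.standard_normal()

sigma2 = np.var(y)  # unconditional variance of Y_n
# Conditional variance of Y_n given its past (one lag suffices for AR(1)):
coef = np.cov(y[1:], y[:-1])[0, 1] / np.var(y[:-1])
resid = y[1:] - coef * y[:-1]
sigma2_cond = np.var(resid)

# Variance-based predictive information: reduction in prediction variance
p_var = sigma2 - sigma2_cond
# Entropy-based predictive information, exact for Gaussian processes:
# I(present; past) = 0.5 * ln(sigma2 / sigma2_cond)
p_ent = 0.5 * np.log(sigma2 / sigma2_cond)
```

For this AR(1) model the entropy-based value has the closed form 0.5·ln(1/(1 − a²)), which the sample estimate approaches as the series length grows; this is the sense in which the Gaussian case admits exact computation from covariances alone.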

Highlights

  • The framework of information dynamics is rapidly emerging, at the interface between the theoretical fields of information theory and statistical physics and applied fields such as neuroscience and physiology, as a versatile and unifying set of tools that dissect the general concept of “information processing” in a network of interacting dynamical systems into basic elements of computation reflecting different aspects of the functional organization of the network [1,2,3]. Within this framework, several tools that incorporate the concept of temporal precedence into the computation of standard information-theoretic measures have been proposed to provide a quantitative description of how collective behaviors in multivariate systems arise from the interaction between the individual system components

  • This work provides an exhaustive framework to dissect the information carried by the target of a network of interacting dynamical systems into atoms of information that form the building blocks of traditional measures of information dynamics such as predictive information, information storage, and information transfer. These basic elements are useful to elucidate the specific contributions of individual systems in the network to the dynamics of the target system, as well as to describe the balance of redundancy and synergy between the sources as they contribute to the information stored in the target and to the information transferred to it

  • Formulating exact values of these measures for the case of Gaussian systems, our theoretical and real-data results illustrate how information storage, transfer, and modification combine to give rise to the predictive information of a target dynamical system connected to multiple source systems


Introduction

The framework of information dynamics is rapidly emerging, at the interface between the theoretical fields of information theory and statistical physics and applied fields such as neuroscience and physiology, as a versatile and unifying set of tools that dissect the general concept of “information processing” in a network of interacting dynamical systems into basic elements of computation reflecting different aspects of the functional organization of the network [1,2,3]. Within this framework, several tools that incorporate the concept of temporal precedence into the computation of standard information-theoretic measures have been proposed to provide a quantitative description of how collective behaviors in multivariate systems arise from the interaction between the individual system components. Recent studies have implemented these measures in cardiovascular physiology to study the short-term dynamics of the cardiac, vascular, and respiratory systems in terms of information storage, transfer, and modification [12,13,29]
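Among the measures based on temporal precedence mentioned above, information transfer is classically quantified as the reduction in uncertainty about a target's present obtained by conditioning on a source's past, beyond what the target's own past provides. The following sketch (illustrative only; the bivariate VAR(1) model, coupling strength, and function names are assumptions, not the authors' implementation) computes the Gaussian transfer entropy from X to Y via the residual variances of restricted and full linear regressions:

```python
# Hedged sketch: Gaussian transfer entropy X -> Y in a simulated
# bivariate VAR(1), from residual variances of nested linear models.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = np.zeros(n)
y = np.zeros(n)
# X drives Y through the lagged coupling term 0.4 * x[t-1]
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def resid_var(target, regressors):
    """Residual variance of the least-squares regression of target on regressors."""
    beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    return np.var(target - regressors @ beta)

yt, yp, xp = y[1:], y[:-1], x[:-1]
v_restricted = resid_var(yt, np.column_stack([yp]))      # past of Y only
v_full = resid_var(yt, np.column_stack([yp, xp]))        # past of Y and X
# For Gaussian processes the transfer entropy is exactly the log-ratio
# of the restricted and full prediction error variances:
te_xy = 0.5 * np.log(v_restricted / v_full)
```

Conditioning on the source's past can only shrink the prediction error, so `v_full <= v_restricted` and the transfer entropy is non-negative; it vanishes when the source adds no predictive information about the target beyond the target's own past.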
