Abstract

The amount of information exchanged per unit of time between two dynamic processes is an important concept for the analysis of complex systems. Theoretical formulations and data-efficient estimators have recently been introduced for this quantity, known as the mutual information rate (MIR), allowing its continuous-time computation for event-based data sets measured as realizations of coupled point processes. This work presents the implementation of MIR for point-process applications in Network Physiology and cardiovascular variability, which typically feature short and noisy experimental time series. We assess the bias of the MIR estimated for uncoupled point processes within a surrogate data framework, and we compensate for it by introducing a corrected MIR (cMIR) measure designed to return zero values when the two processes do not exchange information. The method is first tested extensively in synthetic point processes, including a physiologically-based model of heartbeat dynamics and blood pressure propagation times, where we show the ability of cMIR to compensate for the negative bias of MIR and return statistically significant values even for weakly coupled processes. The method is then assessed in real point-process data measured from healthy subjects during different physiological conditions, showing that the cMIR between heartbeat and pressure propagation times increases significantly during postural stress, though not during mental stress. These results document that cMIR reflects physiological mechanisms of cardiovascular variability related to the joint neural autonomic modulation of heart rate and arterial compliance.
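
To make the surrogate-based correction concrete, the minimal Python sketch below shows one plausible way to compensate the estimator bias: the MIR of the original data is reduced by the mean MIR obtained over surrogate realizations in which the coupling has been destroyed by shuffling inter-event intervals. The function `estimate_mir` is a hypothetical placeholder for any MIR estimator, and both the shuffling strategy and the subtraction of the surrogate mean are illustrative assumptions, not the exact procedure defining cMIR in the paper.

```python
import numpy as np

def iei_shuffle_surrogate(events, rng):
    """Destroy the coupling by randomly permuting the inter-event
    intervals of one process (an assumed surrogate strategy)."""
    iei = np.diff(events)
    return events[0] + np.concatenate(([0.0], np.cumsum(rng.permutation(iei))))

def corrected_mir(events_x, events_y, estimate_mir, n_surr=100, seed=0):
    """Sketch of a surrogate-based bias correction: subtract from the
    MIR of the original data the mean MIR obtained on uncoupled
    surrogates, so that uncoupled processes score close to zero."""
    rng = np.random.default_rng(seed)
    mir = estimate_mir(events_x, events_y)   # any MIR estimator (placeholder)
    surr = [estimate_mir(events_x, iei_shuffle_surrogate(events_y, rng))
            for _ in range(n_surr)]
    return mir - np.mean(surr)
```

By construction, such a corrected value fluctuates around zero when the two processes are uncoupled, mirroring the design goal stated above for cMIR.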

Highlights

  • The mutual information (MI) between two random variables is a central concept in information theory

  • In the reported application context, where the direction of interaction is determined by the cardiac pacemaker triggering the propagation of the sphygmic waves through the arterial bed, studying causal interactions through the transfer entropy rate (TER) is less relevant than assessing the coupling between heartbeat and systolic times through the MI rate (MIR)

  • The adopted estimator combines the fact that, for point processes, the MIR can be formulated in terms of the TER (Mijatovic et al, 2021a) with the approach of representing the dynamic states of point processes in terms of inter-event intervals, so as to efficiently capture information flows (Shorten et al, 2021)
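
As an illustration of the kind of nearest-neighbour machinery that such interval-based estimators build on, the sketch below implements a standard Kraskov (KSG) mutual information estimator applied to inter-event interval embeddings. This is not the exact MIR/TER estimator of Mijatovic et al (2021a) or Shorten et al (2021); the variable names in the usage comment (`rr_intervals`, `pulse_transit_intervals`) and the one-dimensional embeddings are assumptions used only to convey the idea of estimating shared information from inter-event intervals.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """Kraskov (KSG, algorithm 1) nearest-neighbour MI estimator,
    shown only to illustrate the kNN machinery behind interval-based
    estimators; not the paper's exact MIR/TER estimator."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])
    # distance to the k-th neighbour in the joint space (max norm),
    # excluding the point itself (first returned neighbour)
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]
    # neighbours strictly within eps in each marginal space (minus self)
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Hypothetical usage on inter-event interval embeddings:
# mi_nats = ksg_mutual_information(rr_intervals[:-1], pulse_transit_intervals[1:])
```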

Summary

INTRODUCTION

The mutual information (MI) between two random variables is a central concept in information theory. MI is an important quantity with great practical relevance, as it quantifies how much information is exchanged between two complex systems or shared by two data sets. Thanks to these characteristics, MI is ubiquitously employed in diverse fields of science and engineering to assess linear and non-linear interactions, e.g., between electronic oscillators (Minati et al, 2018), financial systems (Fiedor, 2014), climatological variables (Perinelli et al, 2021), brain units (Mijatovic et al, 2021b) or physiological systems (Valderas et al, 2019). The novel cMIR measure is tested first in simulated point-process models that reproduce the coupled occurrence of the heartbeat times and of the arrival instants of the blood pressure wave at the body periphery, and then in real point-process series measured from healthy subjects monitored at rest and during postural and mental stress (Javorka et al, 2017).
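
As a rough idea of how such coupled point processes can be generated for testing, the toy sketch below draws heartbeat times from autocorrelated RR intervals and obtains peripheral pulse-arrival times by adding a transit delay that partly tracks the RR series. This is only an assumed illustration, not the physiologically-based model used in the paper; the `coupling` parameter and all numerical constants are arbitrary.

```python
import numpy as np

def simulate_coupled_point_processes(n_beats=300, coupling=0.5, seed=1):
    """Toy generator of coupled event times: heartbeat times built from
    AR(1) RR intervals, and peripheral pulse-arrival times obtained by
    adding a transit delay that partly tracks the RR series."""
    rng = np.random.default_rng(seed)
    rr = np.empty(n_beats)
    rr[0] = 0.8                                   # mean RR interval [s]
    for i in range(1, n_beats):
        rr[i] = 0.8 + 0.8 * (rr[i - 1] - 0.8) + 0.02 * rng.standard_normal()
    heartbeats = np.cumsum(rr)
    # transit delay: 0.25 s baseline, modulated by RR when coupling > 0
    transit = 0.25 + coupling * 0.05 * (rr - 0.8) + 0.005 * rng.standard_normal(n_beats)
    pulse_arrivals = heartbeats + transit
    return heartbeats, pulse_arrivals
```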

INFORMATION-THEORETIC MEASURES
Mutual Information and Transfer Entropy
Computation for Bivariate Point Processes
Practical Estimation
Corrected Measure of Mutual Information Rate
SIMULATION STUDY
Simulation 1
Simulation Results
Simulation 3
APPLICATION TO REAL DATA
Database and Experimental Protocol
Results and Discussion
CONCLUDING REMARKS
DATA AVAILABILITY STATEMENT