Abstract

A method is shown for computing transfer entropy over multiple time lags for coupled autoregressive processes, using formulas for the differential entropy of multivariate Gaussian processes. Two examples are provided: (1) a first-order filtered noise process whose state is measured with additive noise, and (2) two first-order coupled processes, each driven by white process noise. For the first example, we found that increasing the first-order AR coefficient while holding fixed the correlation coefficient between the filtered and measured processes increases the transfer entropy, because the entropy of the measured process itself increases. For the second example, the minimum correlation coefficient occurs when the process noise variances match, and matching these variances minimizes the total information flow, expressed as the sum of the transfer entropies in both directions. Without a match, the transfer entropy is larger in the direction away from the process having the larger process noise. With the process noise variances fixed, the transfer entropies in both directions increase with the coupling strength. Finally, we note that the method can be employed more generally to compute other information-theoretic quantities as well.

Highlights

  • Transfer entropy [1] quantifies the information flow between two processes

  • Information is defined to be flowing from system X to system Y whenever knowing the past states of X reduces the uncertainty of one or more of the current states of Y above and beyond what uncertainty reduction is achieved by only knowing the past Y states

  • Transfer entropy is the mutual information between the current state of system Y and one or more past states of system X, conditioned on one or more past states of system Y
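For jointly Gaussian variables, the conditional mutual information described in the bullets above reduces to a combination of log-determinants of covariance blocks, so transfer entropy can be computed directly from a joint covariance matrix. A minimal sketch of this reduction (function and variable names are ours, not the paper's):

```python
import numpy as np

def gaussian_entropy(C):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance C."""
    C = np.atleast_2d(C)
    n = C.shape[0]
    return 0.5 * (n * np.log(2 * np.pi * np.e) + np.linalg.slogdet(C)[1])

def transfer_entropy(C, y_now, y_past, x_past):
    """TE_{X->Y} = H(Y_t | Y_past) - H(Y_t | Y_past, X_past), from the joint
    covariance C; y_now, y_past, x_past are lists of row/column indices."""
    def H(idx):
        return gaussian_entropy(C[np.ix_(idx, idx)])
    # Apply H(A|B) = H(A,B) - H(B) to both conditional entropies
    return (H(y_now + y_past) - H(y_past)) - \
           (H(y_now + y_past + x_past) - H(y_past + x_past))
```

With independent components the transfer entropy is zero, and correlation between the past of X and the present of Y (beyond what Y's own past explains) makes it positive.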


Summary

Introduction

Transfer entropy [1] quantifies the information flow between two processes. Information is defined to be flowing from system X to system Y whenever knowing the past states of X reduces the uncertainty of one or more of the current states of Y above and beyond the uncertainty reduction achieved by knowing only the past Y states. Each of the two transfer entropy values TEx→y and TEy→x is nonnegative, and both will be positive (and not necessarily equal) when information flow is bi-directional. Because of these properties, transfer entropy is useful for detecting causal relationships between systems generating measurement time series. In what follows, we first show how to compute the covariance matrix for successive iterates of the example AR processes, and then use these matrices to compute transfer entropy quantities based on the differential entropy expression for multivariate Gaussian random variables. Note that Kaiser and Schreiber [8] have previously shown how to compute information transfer metrics for continuous-time processes. In their paper they provide an explicit example, computing transfer entropy for two linear stochastic processes where one of the processes is autonomous and the other is coupled to it. We begin with a discussion of differential entropy, the formulation of entropy appropriate to the continuous-valued processes we consider.

Differential Entropy
For a discrete random variable $X$ with probabilities $p_i$, the entropy is
$$H(X) = -\sum_i p_i \log p_i,$$
and for a continuous random variable with density $f(x)$, the differential entropy is
$$h(X) = -\int f(x) \log f(x)\, dx.$$
For an $n$-dimensional Gaussian random vector with covariance matrix $C$, this evaluates to
$$h(X) = \tfrac{1}{2} \log\!\left((2\pi e)^n \det C\right).$$
$$C_{XY} = \operatorname{cov}(X, Y)$$
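The Gaussian differential entropy formula can be checked numerically, since $h(X)$ equals the expectation of $-\ln f(x)$ under the density $f$. A Monte Carlo sketch (the covariance values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_entropy(C):
    """h = 0.5 * ln((2*pi*e)^n * det C), in nats."""
    C = np.atleast_2d(C)
    n = C.shape[0]
    return 0.5 * (n * np.log(2 * np.pi * np.e) + np.linalg.slogdet(C)[1])

# Monte Carlo check: h = E[-ln f(x)] with x ~ N(0, C)
C = np.array([[2.0, 0.6], [0.6, 1.0]])
x = rng.multivariate_normal(np.zeros(2), C, size=200_000)
Cinv = np.linalg.inv(C)
log_f = -0.5 * (np.einsum('ij,jk,ik->i', x, Cinv, x)
                + 2 * np.log(2 * np.pi) + np.log(np.linalg.det(C)))
print(gaussian_entropy(C), -log_f.mean())  # the two estimates agree closely
```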
Transfer Entropy Computation Using a Variable Number of Time Lags
With the noise samples stacked into a vector $u$ and the process iterates expressed as a linear map $S$ of that vector, the covariance of $Su$ follows from
$$E\big[(Su)(Su)^T\big] = S\,E[u u^T]\,S^T,$$
and, selecting the components of interest with a matrix $D$ so that $z = DSu$,
$$C = E\big[z z^T\big] = E\big[(DSu)(DSu)^T\big] = D\,S\,E[u u^T]\,S^T D^T.$$
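The covariance construction above can be made concrete for an AR(1) process, whose iterates are an explicit linear map of the white-noise history. A numerical sketch (the coefficient, noise variance, and horizon are illustrative assumptions):

```python
import numpy as np

a, sigma2, N = 0.8, 1.0, 200  # illustrative AR(1) coefficient, noise variance, horizon

# x_t = a*x_{t-1} + w_t with x_0 = w_0 gives x_t = sum_k a^(t-k) w_k,
# i.e. z = S u with S lower triangular: S[t, k] = a^(t-k) for k <= t.
t = np.arange(N)
S = np.where(t[:, None] >= t[None, :], a ** (t[:, None] - t[None, :]), 0.0)

# C = E[z z^T] = S E[u u^T] S^T = sigma2 * S @ S.T for white noise u
C = sigma2 * S @ S.T

# Far from the initial condition, C approaches the stationary AR(1) covariance,
# whose lag-1 autocovariance is sigma2 * a / (1 - a^2)
print(C[-1, -2], sigma2 * a / (1 - a ** 2))
```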
Example 1: A One-Way Coupled System
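The abstract describes this example as a first-order filtered-noise process observed with additive measurement noise, with transfer entropy increasing in the AR coefficient when the filtered/measured correlation is held fixed. A sketch of that calculation using stationary covariances and a single time lag (the parameter values and one-lag formulation are our assumptions, not the paper's):

```python
import numpy as np

def te_example1(a, sw2=1.0, rho2=0.8):
    """One-lag TE_{X->Y} for x_{t+1} = a*x_t + w_t observed as y_t = x_t + v_t.
    var(v) is chosen so the squared x-y correlation (rho2) stays fixed as a varies.
    (Illustrative parameter values, not the paper's.)"""
    P = sw2 / (1 - a ** 2)            # stationary variance of x
    sv2 = P * (1 - rho2) / rho2       # measurement noise holding rho fixed
    # Joint covariance of (y_t, y_{t-1}, x_{t-1})
    J = np.array([[P + sv2, a * P, a * P],
                  [a * P, P + sv2, P],
                  [a * P, P, P]])
    def h(idx):
        sub = J[np.ix_(idx, idx)]
        return 0.5 * (len(idx) * np.log(2 * np.pi * np.e) + np.linalg.slogdet(sub)[1])
    return (h([0, 1]) - h([1])) - (h([0, 1, 2]) - h([1, 2]))

for a in (0.3, 0.6, 0.9):
    print(a, te_example1(a))  # TE grows with the AR coefficient at fixed correlation
```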
Example 2
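The abstract describes this example as two coupled first-order processes, each driven by white process noise, with transfer entropy larger in the direction away from the noisier process. A sketch of that calculation under assumed symmetric bilinear coupling (the coefficients and coupling form are our assumptions, not the paper's); the stationary covariance comes from the discrete Lyapunov equation rather than simulation:

```python
import numpy as np

# Assumed coupled pair: x_{t+1} = a*x_t + eps*y_t + w_t,  y_{t+1} = a*y_t + eps*x_t + v_t
a, eps = 0.5, 0.4
sw2, sv2 = 4.0, 1.0          # process noise variances, var(w) > var(v)
A = np.array([[a, eps], [eps, a]])
Q = np.diag([sw2, sv2])

# Stationary covariance S solves S = A S A^T + Q (vectorized Lyapunov equation)
S = np.linalg.solve(np.eye(4) - np.kron(A, A), Q.ravel()).reshape(2, 2)

# Joint covariance of [x_{t+1}, y_{t+1}, x_t, y_t]
J = np.block([[S, A @ S], [S @ A.T, S]])

def h(idx):
    """Gaussian differential entropy of the indexed block (nats)."""
    sub = J[np.ix_(idx, idx)]
    return 0.5 * (len(idx) * np.log(2 * np.pi * np.e) + np.linalg.slogdet(sub)[1])

# One-lag transfer entropies: TE_{X->Y} = H(y+|y) - H(y+|y,x), and vice versa
te_xy = (h([1, 3]) - h([3])) - (h([1, 3, 2]) - h([3, 2]))
te_yx = (h([0, 2]) - h([2])) - (h([0, 2, 3]) - h([2, 3]))
print(te_xy, te_yx)  # larger in the direction away from the noisier process
```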
Findings
Conclusions