Abstract

We propose a thermodynamic interpretation of transfer entropy near equilibrium, using a specialised Boltzmann's principle. The approach relates conditional probabilities to the probabilities of the corresponding state transitions. This in turn characterises transfer entropy as a difference of two entropy rates: the rate for a resultant transition, and another rate for a possibly irreversible transition within the system affected by an additional source. We then show that this difference, the local transfer entropy, is proportional to the external entropy production, possibly due to irreversibility. Near equilibrium, transfer entropy is also interpreted as the difference in equilibrium stabilities with respect to two scenarios: a default case and the case with an additional source. Finally, we demonstrate that such a thermodynamic treatment is not applicable to information flow, a measure of causal effect.

Highlights

  • Transfer entropy has been introduced as an information-theoretic measure that quantifies the statistical coherence between systems evolving in time [1]

  • For instance, if the source Y is such that the system X is independent of it, there is no difference in the extents of disturbance to the equilibrium, and the transfer entropy is zero

  • In this paper we propose a thermodynamic interpretation of transfer entropy: an information-theoretic measure introduced by Schreiber [1] as the average information contained in the source about the next state of the destination in the context of what was already contained in the destination's past
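The verbal description in the last highlight corresponds to Schreiber's standard definition of transfer entropy (written here with destination history length $k$; this is the textbook form of the measure, not a formula quoted from the page above):

```latex
T_{Y \to X} = \sum_{x_{n+1},\, x_n^{(k)},\, y_n}
  p\!\left(x_{n+1}, x_n^{(k)}, y_n\right)
  \log \frac{p\!\left(x_{n+1} \mid x_n^{(k)}, y_n\right)}
            {p\!\left(x_{n+1} \mid x_n^{(k)}\right)}
```

The numerator conditions the next state of the destination on both its own past $x_n^{(k)}$ and the source $y_n$; the denominator conditions on the destination's past alone, so the measure is zero exactly when the source adds no predictive information.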



Introduction

Transfer entropy has been introduced as an information-theoretic measure that quantifies the statistical coherence between systems evolving in time [1]. This task is not trivial and needs to be approached carefully. Another contribution of this paper is a clarification that a similar thermodynamic treatment is not applicable to information flow, a measure introduced by Ay and Polani [18] in order to capture causal effect. The proposed treatment allows us to identify the components of transfer entropy with the entropy rates of (i) the resultant transition and (ii) the internal entropy production.
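To make the measure concrete, the following is a minimal plug-in estimator of transfer entropy for discrete time series (a sketch, not code from the paper; the function name and interface are our own, and the estimator uses simple frequency counts with source history length 1):

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, dest, k=1):
    """Plug-in estimate of Schreiber's transfer entropy T_{Y->X} in bits,
    for discrete series, with destination history length k."""
    n = len(dest)
    # Count joint outcomes (x_{t+1}, x_t^{(k)}, y_t) over the series.
    triples = Counter()
    for t in range(k, n - 1):
        x_next = dest[t + 1]
        x_hist = tuple(dest[t - k + 1:t + 1])
        triples[(x_next, x_hist, source[t])] += 1
    total = sum(triples.values())
    # Marginal counts needed for the two conditional probabilities.
    hist_src, next_hist, hist = Counter(), Counter(), Counter()
    for (x_next, x_hist, y), c in triples.items():
        hist_src[(x_hist, y)] += c
        next_hist[(x_next, x_hist)] += c
        hist[x_hist] += c
    te = 0.0
    for (x_next, x_hist, y), c in triples.items():
        p_joint = c / total
        p_cond_src = c / hist_src[(x_hist, y)]           # p(x_{t+1} | x_t^{(k)}, y_t)
        p_cond = next_hist[(x_next, x_hist)] / hist[x_hist]  # p(x_{t+1} | x_t^{(k)})
        te += p_joint * np.log2(p_cond_src / p_cond)
    return te
```

For a destination that simply copies the (random binary) source with a one-step lag, the estimate approaches 1 bit; for an independent destination it approaches 0, illustrating the zero-transfer case noted in the highlights.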

Transfer Entropy
Local Transfer Entropy
Causal Effect as Information Flow
Local Information Flow
System Definition
Entropy Definitions
Transition Probabilities
Entropy Production
Range of Applicability
An Example
Transitions Near Equilibrium
Transfer Entropy as Entropy Production
Transfer Entropy as a Measure of Equilibrium’s Stability
Heat Transfer
Causal Effect
Discussion and Conclusions

