Abstract

Information transfer, measured by transfer entropy, is a key component of distributed computation. It is therefore important to understand the pattern of information transfer in order to unravel the distributed computational algorithms of a system. Since distributed computation in many natural systems is thought to rely on rhythmic processes, a frequency-resolved measure of information transfer is highly desirable. Here, we present a novel algorithm, and its efficient implementation, to separately identify the frequencies sending and receiving information in a network. Our approach relies on the invertible maximum overlap discrete wavelet transform (MODWT) for the creation of surrogate data in the computation of transfer entropy and entirely avoids filtering of the original signals. The approach thereby sidesteps well-known problems caused by phase shifts and by the ineffectiveness of filtering in the information-theoretic setting. We also show that measuring frequency-resolved information transfer is a partial information decomposition problem that cannot be fully resolved to date, and we discuss the implications of this issue. Last, we evaluate the performance of our algorithm on simulated data and apply it to human magnetoencephalography (MEG) recordings and to local field potential recordings in the ferret. In human MEG we demonstrate top-down information flow in temporal cortex from very high frequencies (above 100 Hz) to both similarly high frequencies and to frequencies around 20 Hz, i.e. a complex spectral configuration of cortical information transmission that has not been described before. In the ferret we show that the prefrontal cortex sends information at low frequencies (4-8 Hz) to early visual cortex (V1), while V1 receives the information at high frequencies (above 125 Hz).

Highlights

  • Many natural or artificial complex systems perform distributed computation

  • We present an algorithm to measure the information transfer that is associated with specific spectral components in an information source or target

  • Since many rhythms can interact at both the sender and the receiver side, we show that this problem is tightly linked to partial information decomposition, an intriguing problem from information theory that has been solved only recently, and only in part


Introduction

Many natural or artificial complex systems perform distributed computation. In a distributed computation, multiple relatively simple parts of the system perform rather elementary operations on their inputs, but communicate heavily with each other in order to jointly implement complex computations. Many systems display highly rhythmic activity when performing distributed computation, suggesting that measuring the information transfer associated with different spectral components may provide valuable additional insights. For the algorithms presented here we explicitly assume that the multivariate network identification problem has been solved, i.e. that the information transfer between a source and a target, conditional on the relevant rest of the network, is genuine. By this we mean that the information flows directly from source to target and does not flow via any intermediate node that we have data from. This setting can be achieved either by computing multivariate transfer entropies directly (see section), possibly via a greedy approximation [11, 12], or by computing bivariate transfer entropies in combination with an approximate correction method [10].
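To make the quantity at the heart of this setting concrete, the sketch below estimates (bivariate, unconditioned) transfer entropy with a simple plug-in estimator. This is an illustrative assumption on our part, not the authors' implementation and not the MODWT surrogate scheme: it uses equal-width binning, a history length of one sample, and no bias correction, all of which real analyses would need to improve on.

```python
import numpy as np

def transfer_entropy(source, target, bins=4):
    """Plug-in estimate of transfer entropy TE(source -> target) in bits,
    with history length 1 and equal-width binning. Illustrative only:
    real analyses need longer histories and bias-corrected estimators."""
    # discretise each series into `bins` equal-width symbols
    s = np.digitize(source, np.linspace(source.min(), source.max(), bins + 1)[1:-1])
    t = np.digitize(target, np.linspace(target.min(), target.max(), bins + 1)[1:-1])
    tf, tp, sp = t[1:], t[:-1], s[:-1]  # target future, target past, source past

    def H(*vs):
        # joint Shannon entropy (in bits) of discrete variables in [0, bins)
        flat = np.ravel_multi_index(vs, [bins] * len(vs))
        p = np.bincount(flat, minlength=bins ** len(vs)) / flat.size
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # TE(source -> target) = I(target_future ; source_past | target_past)
    return H(tf, tp) - H(tp) - H(tf, tp, sp) + H(tp, sp)

# toy coupled pair: y copies x with a one-sample lag plus weak noise
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.roll(x, 1) + 0.1 * rng.standard_normal(2000)
te_xy = transfer_entropy(x, y)  # clearly positive: x drives y
te_yx = transfer_entropy(y, x)  # near zero: no reverse coupling
```

The asymmetry between `te_xy` and `te_yx` is what makes transfer entropy directional; the frequency-resolved question the article addresses is which spectral components of `x` and `y` carry this transfer.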

