Abstract

Information theory explains how systems encode and transmit information. This article examines the neuronal system, which processes information via neurons that react to stimuli and transmit electrical signals. Specifically, we focus on transfer entropy as a measure of the flow of information between sequences and explore its use in determining effective neuronal connectivity. We analyze the causal relationship between two discrete time series, X := {X_t : t ∈ ℤ} and Y := {Y_t : t ∈ ℤ}, which take values in binary alphabets. When the bivariate process (X, Y) is a jointly stationary ergodic variable-length Markov chain with memory no larger than k, we demonstrate that the null hypothesis of the test, namely the absence of causal influence, is equivalent to a zero transfer entropy rate. The plug-in estimator of this rate is identified with the log-likelihood ratio test statistic. Since this estimator follows an asymptotic chi-squared distribution under the null hypothesis, it facilitates the calculation of p-values on empirical data. The efficacy of the hypothesis test is illustrated with data simulated from a neuronal network model composed of stochastic neurons with variable-length memory. The test results identify biologically relevant information, validating the underlying theory and highlighting the applicability of the method for understanding effective connectivity between neurons.
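To make the procedure concrete, recall that the transfer entropy from Y to X with memory k is TE(Y→X) = Σ p(x_t, x⁻, y⁻) log[ p(x_t | x⁻, y⁻) / p(x_t | x⁻) ], where x⁻ and y⁻ denote the length-k pasts of the two series. Below is a minimal sketch of the plug-in estimator and the associated chi-squared test for binary sequences. It is our illustration, not the paper's code: the function names are hypothetical, and it assumes a full order-k Markov chain rather than the paper's variable-length contexts, which changes only the degrees-of-freedom bookkeeping.

```python
import numpy as np
from collections import Counter
from scipy.stats import chi2

def plug_in_transfer_entropy(x, y, k=1):
    """Plug-in estimate of the transfer entropy rate from y to x.

    Counts occurrences of (x-past, y-past, x-present) over sliding
    windows of length k and plugs the empirical conditional
    probabilities into the transfer entropy formula.
    """
    n = len(x) - k
    joint = Counter()      # (x_past, y_past, x_now) triples
    cond_xy = Counter()    # (x_past, y_past) contexts
    joint_x = Counter()    # (x_past, x_now) pairs
    cond_x = Counter()     # x_past contexts
    for t in range(k, len(x)):
        xp, yp = tuple(x[t - k:t]), tuple(y[t - k:t])
        joint[(xp, yp, x[t])] += 1
        cond_xy[(xp, yp)] += 1
        joint_x[(xp, x[t])] += 1
        cond_x[xp] += 1
    te = 0.0
    for (xp, yp, xn), c in joint.items():
        p_joint = c / n                              # p(x_t, x_past, y_past)
        p_full = c / cond_xy[(xp, yp)]               # p(x_t | x_past, y_past)
        p_null = joint_x[(xp, xn)] / cond_x[xp]      # p(x_t | x_past)
        te += p_joint * np.log(p_full / p_null)
    return te

def transfer_entropy_test(x, y, k=1):
    """Chi-squared test of H0: zero transfer entropy rate from y to x.

    Under H0, 2 * n * TE_hat is asymptotically chi-squared; for binary
    alphabets and a full memory-k chain the degrees of freedom are
    |X|**k * (|X| - 1) * (|Y|**k - 1) = 2**k * (2**k - 1).
    """
    n = len(x) - k
    stat = 2.0 * n * plug_in_transfer_entropy(x, y, k)
    df = 2 ** k * (2 ** k - 1)
    return stat, chi2.sf(stat, df)
```

Applied to two simulated binary spike trains, transfer_entropy_test returns the log-likelihood ratio statistic and its p-value; a small p-value rejects the null of no causal influence from Y to X in that direction, which is how directed effective connectivity is read off a network.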
