Abstract

Statistical dependency between neuronal spike trains forms a basis for information encoding in memory and learning. Such dependency could also be a key to detecting pathologies: for instance, Alzheimer’s disease has been associated with hyper-synchronicity, which indicates abnormally high statistical dependency. Studying this dependency in depth requires its accurate quantification within short intervals, in view of inherent time variations. However, existing Lempel-Ziv-based schemes tend to converge slowly. In response, we propose quantifying dependency via the empirical mutual information rate, which is shown to converge satisfactorily fast. In particular, fast convergence was demonstrated for simulated Markov processes as well as experimentally observed neuronal spike trains. Furthermore, heterogeneity was observed in mutual information rate estimates and in process memory structures even within a few neuron pairs. A large-scale study will likely reveal more useful patterns, which could potentially emerge as disease signatures. The proposed statistical dependency estimator will also enable studies relating neuronal organization in physical networks to patterns of information processing.
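As one minimal sketch of the general idea (not the authors' exact estimator), the mutual information rate between two binned spike trains can be approximated with a plug-in block estimator, I_k = (H_k(X) + H_k(Y) - H_k(X, Y)) / k, where H_k denotes the empirical entropy of length-k blocks. The function names and the block length k below are illustrative assumptions:

```python
from collections import Counter
from math import log2

def empirical_entropy(blocks):
    """Plug-in Shannon entropy (bits) of a sequence of symbols."""
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mi_rate_estimate(x, y, k=3):
    """Plug-in estimate of the mutual information rate (bits/bin)
    between two equal-length binary spike-indicator sequences,
    using length-k blocks: I_k = (H_k(X) + H_k(Y) - H_k(X,Y)) / k.
    This is an illustrative sketch, not the paper's estimator."""
    assert len(x) == len(y)
    # Overlapping length-k blocks of each train, and joint blocks.
    xb = [tuple(x[i:i + k]) for i in range(len(x) - k + 1)]
    yb = [tuple(y[i:i + k]) for i in range(len(y) - k + 1)]
    jb = list(zip(xb, yb))
    return (empirical_entropy(xb) + empirical_entropy(yb)
            - empirical_entropy(jb)) / k
```

For independent trains the estimate is close to zero (up to the well-known positive bias of plug-in estimators on finite samples), while strongly coupled trains yield markedly positive values; choosing k involves the usual bias-variance trade-off for short recordings.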
