Abstract

Information theory is increasingly used to analyze complex, self-organizing processes on networks, predominantly in analytical and numerical studies. One of the most paradigmatic complex systems is a network of neurons, in which cognition arises from the storage, transfer, and processing of information among individual neurons. In this article we review experimental techniques suitable for validating information-theoretical predictions in simple neural networks, as well as for generating new hypotheses. Specifically, we focus on techniques that can measure both network (microcircuit) anatomy and neuronal activity simultaneously. Such simultaneous measurements are needed to study the role of network structure in shaping the emergent collective dynamics, which is one of the main motivations for characterizing information processing. We discuss in detail two suitable techniques, namely calcium imaging and the application of multi-electrode arrays to simple neural networks in culture, and examine their advantages and limitations in a manner accessible to non-experts. In particular, we show that each technique induces a qualitatively different type of error on the measured mutual information. The ultimate goal of this work is to bridge the gap between theorists and experimentalists in their shared aim of understanding the behavior of networks of neurons.
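To illustrate the kind of effect at stake, the minimal Python sketch below (our own illustration, not the paper's analysis pipeline) estimates the pairwise mutual information between two binarized spike trains with a plug-in histogram estimator, then applies two hypothetical measurement distortions: spike-detection dropout, loosely mimicking missed events in multi-electrode recordings, and single-bin temporal jitter, loosely mimicking the slow dynamics of calcium indicators. The function name plugin_mutual_information, the error rates, and both error models are assumptions made here for illustration only.

    import numpy as np

    def plugin_mutual_information(x, y):
        # Plug-in (histogram) estimate of I(X;Y) in bits for two binary sequences.
        joint, _, _ = np.histogram2d(x, y, bins=2)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)  # marginal of X, shape (2, 1)
        py = pxy.sum(axis=0, keepdims=True)  # marginal of Y, shape (1, 2)
        nz = pxy > 0                         # skip empty cells to avoid log(0)
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    n = 50_000
    x = rng.integers(0, 2, n)                    # source train: spike (1) or silence (0) per time bin
    y = np.where(rng.random(n) < 0.8, x, 1 - x)  # target train: follows x in 80% of bins

    # Hypothetical MEA-like distortion: spike-detection dropout (30% of spikes missed).
    y_dropout = np.where((y == 1) & (rng.random(n) < 0.3), 0, y)

    # Hypothetical calcium-imaging-like distortion: slow indicator dynamics modeled
    # crudely as shifting 30% of bins to the neighboring time bin (temporal jitter).
    y_jitter = np.where(rng.random(n) < 0.3, np.roll(y, 1), y)

    print("MI, clean recording: %.3f bits" % plugin_mutual_information(x, y))
    print("MI, spike dropout:   %.3f bits" % plugin_mutual_information(x, y_dropout))
    print("MI, temporal jitter: %.3f bits" % plugin_mutual_information(x, y_jitter))

Under these toy error models the estimated mutual information degrades through qualitatively different mechanisms: dropout distorts the marginal statistics of the recorded train, whereas jitter leaves the marginals intact but destroys the temporal alignment on which the pairwise estimate relies.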

Highlights

  • In the field of complexity and complex systems research there is a growing interest in the use of information theory (IT) as a principal tool in the development of new theories [1,2,3,4,5,6,7,8,9,10,11].

  • Applications of information theory to complex systems research range from the pragmatic viewpoint, such as using the mutual information function as a non-linear correlation measure, to the more fundamental viewpoint intended here, where each dynamical system is interpreted as a (Turing) computation, consisting of the storage, transfer, and modification of information.

  • We focus on pairwise mutual information because many derived information-theoretical measures can be written as a sum of pairwise terms (a worked instance follows this list).
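As one concrete instance of this pairwise decomposition (our illustration, not taken from the paper): for a Markov chain $X_1 \to X_2 \to \dots \to X_n$, the multi-information reduces exactly to a sum of pairwise mutual information terms,

    I(X_1; \dots; X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \dots, X_n) = \sum_{i=2}^{n} I(X_{i-1}; X_i),

because the chain rule gives $H(X_1, \dots, X_n) = H(X_1) + \sum_{i=2}^{n} H(X_i \mid X_{i-1})$ under the Markov property.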

Introduction

In the field of complexity and complex systems research, there is a growing interest in the use of information theory (IT) as a principal tool in the development of new theories [1,2,3,4,5,6,7,8,9,10,11]. Applications of information theory to complex systems research range from the pragmatic viewpoint, such as using the mutual information function as a non-linear correlation measure, to the more fundamental viewpoint intended here, where each dynamical system is interpreted as a (Turing) computation, consisting of the storage, transfer, and modification of information. Informational bits are considered to be physically stored in the states of dynamical components (neurons) at a given instant. The basic building blocks of such theoretical work are Shannon's entropy and mutual information functions. From these building blocks, a wide range of derived measures has been constructed, such as information redundancy, multi-information, information synergy [9,10,18], and (localized) Transfer Entropy.
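For reference, the standard definitions of these building blocks, written here for discrete variables with base-2 logarithms (bits), are

    H(X) = -\sum_{x} p(x) \log_2 p(x),

    I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X,Y),

    T_{Y \to X} = \sum_{x_{t+1},\, x_t,\, y_t} p(x_{t+1}, x_t, y_t) \log_2 \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)},

where the transfer entropy is shown for history length one; the general definition conditions on length-k and length-l past states of X and Y.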
