Abstract

The brain works as a dynamic system to process information, yet various challenges remain in understanding the connection between the informational and dynamical attributes of the brain. The present research explores how the characteristics of neural information functions are linked to neural dynamics. We attempt to bridge dynamics metrics (e.g., Kolmogorov-Sinai entropy) and information metrics (e.g., mutual information and Fisher information) on the stimulus-triggered stochastic dynamics of neural populations. On the one hand, our unified analysis identifies essential features of information-processing-related neural dynamics: we discover spatiotemporal differences in the dynamic randomness and the degree of chaos of neural dynamics during information processing. On the other hand, our framework reveals the fundamental role of neural dynamics in shaping neural information processing: under specific conditions, neural dynamics drives encoding and decoding properties to vary in opposite directions, and it determines the neural representation of the stimulus distribution. Overall, our findings point to a potential direction for explaining how neural information processing emerges from neural dynamics and help clarify the intrinsic connections between the informational and the physical brain.
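
The metrics above are defined formally in the paper's Appendixes. As a rough, self-contained illustration of the information side of this bridge, the Python sketch below computes mutual information and Fisher information for a hypothetical single neuron with a Gaussian tuning curve and Poisson spiking; all names, parameters, and the encoding model itself are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical encoder (an assumption, not the paper's model): one neuron with
# a Gaussian tuning curve rate(s) = r_max * exp(-(s - s0)^2 / (2 w^2)) and
# Poisson spike counts k.
r_max, s0, w = 20.0, 0.0, 1.0
stimuli = np.linspace(-3, 3, 61)               # discretized stimulus values s
p_s = np.full(stimuli.size, 1 / stimuli.size)  # uniform stimulus prior p(s)

rate = r_max * np.exp(-(stimuli - s0) ** 2 / (2 * w ** 2))

# Response distribution p(k | s) over spike counts k = 0..200.
ks = np.arange(201)
p_k_given_s = poisson.pmf(ks[None, :], rate[:, None])  # shape (n_s, n_k)
p_k = p_s @ p_k_given_s                                # marginal p(k)

# Mutual information I(S; K) = sum_{s,k} p(s) p(k|s) log2[p(k|s) / p(k)].
safe_pk = np.where(p_k > 0, p_k, 1.0)
ratio = np.where(p_k_given_s > 0, p_k_given_s / safe_pk[None, :], 1.0)
mutual_info = np.sum(p_s[:, None] * p_k_given_s * np.log2(ratio))

# Fisher information of a Poisson code: J(s) = rate'(s)^2 / rate(s).
drate = -(stimuli - s0) / w ** 2 * rate
fisher = drate ** 2 / rate

print(f"I(S;K) = {mutual_info:.3f} bits, peak J(s) = {fisher.max():.3f}")
```

As expected for a single tuned neuron, the Fisher information vanishes at the tuning-curve peak and is largest on its flanks, where the firing rate changes fastest with the stimulus.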

Highlights

  • Understanding how the brain works lies at the frontier of the intersection of biology and physics [1]

  • After reviewing how the attributes of neural information functions are quantified, we characterize the neural dynamics underlying information processing and analyze how those attributes emerge from the dynamics

  • Based on the illustrated instances and statistical results, we find that the dynamic randomness of each neuron, measured by H_KS, and the diversity of dynamic randomness across neurons, quantified by the variance Var(H_KS) among neurons, are both negatively correlated with τ ∈ [1, 50] (see the sketch after this list)
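
As a minimal sketch of the computation this highlight refers to, assuming a toy ensemble: the code below estimates a per-neuron entropy rate from discretized activity (for Markov dynamics, the entropy rate coincides with the Kolmogorov-Sinai entropy H_KS), then reports its mean and variance across neurons as a correlation-time parameter τ grows. The Ornstein-Uhlenbeck-like traces and every parameter here are our assumptions; the paper's H_KS is defined from neural tuning properties in its Appendixes, so this only illustrates the shape of the computation, not the reported statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_rate(states, n_states):
    """Entropy rate (bits/step) of a discretized trace, assuming first-order
    Markov dynamics; for Markov processes this equals the KS entropy."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1                                # transition counts
    pi = counts.sum(axis=1) / counts.sum()               # state occupancies
    P = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
    logP = np.log2(np.where(P > 0, P, 1.0))              # zeros contribute 0
    return -np.sum(pi[:, None] * P * logP)

def simulate_neuron(tau, T=5000, n_states=8):
    """Toy neuron: a discretized AR(1)/Ornstein-Uhlenbeck-like trace whose
    smoothness (predictability) grows with the correlation time tau."""
    x, xs = 0.0, np.empty(T)
    for t in range(T):
        x += -x / tau + rng.normal()
        xs[t] = x
    edges = np.quantile(xs, np.linspace(0, 1, n_states + 1)[1:-1])
    return np.digitize(xs, edges)                        # states 0..n_states-1

for tau in (1, 10, 50):
    h = [entropy_rate(simulate_neuron(tau), 8) for _ in range(20)]
    print(f"tau={tau:>2}: mean H_KS = {np.mean(h):.3f} bits/step, "
          f"Var(H_KS) = {np.var(h):.5f}")
```

In this toy, larger τ yields smoother, more predictable traces and hence lower entropy rates, matching the direction of the mean-H_KS correlation stated above.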

INTRODUCTION

Understanding how the brain works lies at the frontier of the intersection of biology and physics [1]. A dilemma exists: stochastic models [29,30,31,32] surpass deterministic models [33,34,35,36,37] in supporting information-theoretical metrics, but these stochastic approaches are weak at defining the dynamics that arise from interneuron interactions and neural tuning properties (i.e., the response selectivity to stimuli). Another challenge arises from the lack of an applicable metric for the neural dynamics involved in information processing. The significance of our pursuit lies in the possibility of exploring the fundamental connections between the physical (dynamics) and informational (information) aspects of the brain [1]. While we concentrate on physical pictures and neuroscience background throughout the paper, the systematic description of all mathematical implementations can be found in the Appendixes.
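
To make the dynamics side concrete: for Markov models of neural activity, the Kolmogorov-Sinai entropy reduces to the entropy rate of the chain. The sketch below computes it for a homogeneous discrete-time chain from its transition matrix; this is the standard textbook construction and only a simplified stand-in, since the paper works with nonhomogeneous continuous-time chains and ties the entropy to neural tuning properties (see the Appendixes).

```python
import numpy as np

def ks_entropy_markov(P):
    """KS entropy (= entropy rate, bits/step) of a homogeneous discrete-time
    Markov chain with row-stochastic transition matrix P."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
    pi = pi / pi.sum()
    logP = np.log2(np.where(P > 0, P, 1.0))   # zero entries contribute 0
    return -np.sum(pi[:, None] * P * logP)

# Two toy 3-state chains: near-deterministic vs. maximally random transitions.
P_ordered = np.array([[0.90, 0.05, 0.05],
                      [0.05, 0.90, 0.05],
                      [0.05, 0.05, 0.90]])
P_random = np.full((3, 3), 1 / 3)

print(ks_entropy_markov(P_ordered))  # ~0.569 bits/step: predictable dynamics
print(ks_entropy_markov(P_random))   # log2(3) = 1.585: maximal randomness
```

Higher KS entropy here means greater dynamic randomness, which is the sense in which the paper's H_KS quantifies the randomness of neural activity.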

The remainder of the paper and its Appendixes cover the following topics:

  • Neural population description
  • Neural activities of input neurons
  • Neural activities of intermediary neurons
  • Neural tuning Kolmogorov-Sinai entropy
  • Chaos in neural activities
  • Information-theoretical metrics reformulation
  • Finding 1
  • Finding 2
  • Finding 3
  • Finding 4
  • Significance of our work
  • Validity and limitations
  • Future directions
  • Characterizing neural populations
  • The neural activity of an input neuron
  • The neural activity of an intermediary neuron
  • Neural activities following a nonhomogeneous continuous-time Markov chain
  • Defining the Kolmogorov-Sinai entropy from neural tuning properties
  • Chaos of neural activities
  • Properties of neural encoding
  • Properties of neural decoding