Abstract

Neural systems process information in a dynamical regime between silence and chaotic dynamics. This has led to the criticality hypothesis, which suggests that neural systems reach such a state by self-organizing toward the critical point of a dynamical phase transition. Here, we study a minimal neural network model that exhibits self-organized criticality in the presence of stochastic noise, using a rewiring rule that relies only on local information. During network evolution, incoming links are added to or deleted from a node, depending on the node's average activity. Based on this rewiring rule alone, the network evolves toward a critical state, showing the typical power-law-distributed avalanche statistics. The observed exponents agree with criticality as predicted by dynamical scaling theory, as well as with the exponents observed in neural avalanches. The critical state of the model is reached autonomously without parameter tuning, is independent of initial conditions, is robust under stochastic noise, and is independent of implementation details, as different variants of the model indicate. We argue that this supports the hypothesis that real neural systems may utilize such a mechanism to self-organize toward criticality, especially during early developmental stages.
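The activity-dependent rewiring rule described above can be illustrated with a toy simulation. The sketch below is not the authors' model; it is an assumption-laden illustration in which binary threshold nodes with noise-driven spontaneous activation, a uniform synaptic weight W, a target average activity T_TARGET, and an averaging window WINDOW are all hypothetical choices. It only shows the core idea: each node compares its own average activity to a target and adds an incoming link when it is too quiet or deletes one when it is too active.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64            # number of nodes (illustrative size)
T_TARGET = 0.5    # hypothetical target average activity per node
P_SPONT = 0.01    # hypothetical noise-driven spontaneous activation probability
W = 0.6           # hypothetical uniform weight of existing links
STEPS = 20000     # simulation length
WINDOW = 100      # steps over which activity is averaged between rewirings

adj = np.zeros((N, N), dtype=bool)   # adj[i, j] is True if there is a link from j into i
state = rng.random(N) < 0.5          # binary node states
activity_sum = np.zeros(N)           # running sum of each node's activity

for t in range(1, STEPS + 1):
    # Threshold dynamics: a node fires if its summed input reaches a unit
    # threshold, or spontaneously due to stochastic noise.
    drive = W * (adj @ state.astype(float))
    state = (drive >= 1.0) | (rng.random(N) < P_SPONT)
    activity_sum += state

    # Local rewiring: every WINDOW steps each node compares its own average
    # activity to the target and adds or removes a single incoming link.
    if t % WINDOW == 0:
        avg = activity_sum / WINDOW
        for i in range(N):
            if avg[i] < T_TARGET:
                # too quiet: add a random incoming link that does not yet exist
                candidates = np.flatnonzero(~adj[i])
                candidates = candidates[candidates != i]
                if candidates.size:
                    adj[i, rng.choice(candidates)] = True
            elif avg[i] > T_TARGET:
                # too active: delete a random existing incoming link
                existing = np.flatnonzero(adj[i])
                if existing.size:
                    adj[i, rng.choice(existing)] = False
        activity_sum[:] = 0.0

print("mean in-degree after self-organization:", adj.sum(axis=1).mean())
```

Because every rewiring decision uses only the node's own activity history, no global quantity needs to be measured or tuned; in the paper's model this kind of purely local feedback is what drives the network toward the critical state.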
