Abstract

Information processing in the large-scale network of the human brain underlies its cognitive functions. Because the brain must adapt to changing environments under biological constraints, this processing can be hypothesized to be optimized. Principles based on information optimization are therefore expected to play a central role in shaping the dynamics and the topological structure of the brain network. Recent studies on the functional connectivity between brain regions, referred to as the functional connectome, have revealed characteristic network properties such as self-organized criticality of brain dynamics and small-world topology. However, these important attributes have been established separately, and their relation to the principle of information optimization is unclear. Here, we show that the maximization principle of the mutual information entropy induces an optimal state at which the small-world network topology and the criticality of the activation dynamics emerge. Our findings, based on functional connectome analyses, show that as the mutual information entropy increases, the coactivation pattern converges to the state of self-organized criticality, and a phase transition of the network topology, which gives rise to the small-world topology, occurs simultaneously at the same point. The coincidence of these phase transitions at the same critical point indicates that the criticality of the dynamics and the phase transition of the network topology are rooted in the same phenomenon, driven by the maximization of the mutual information. Consequently, the two distinct attributes of the brain, self-organized criticality and small-world topology, can be understood from a unified perspective under an information-based principle. Our study thus provides insight into the mechanism underlying information processing in the brain.
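
To make these quantities concrete, the following is a minimal illustrative sketch in Python (NumPy and NetworkX), not the authors' analysis pipeline: it estimates the mutual information between two binarized regional activation sequences, and a common small-world coefficient sigma = (C/C_rand)/(L/L_rand) from a thresholded connectivity matrix. The threshold value, the edge-matched Erdos-Renyi baseline, the synthetic data, and the function names are assumptions introduced here only for illustration.

    import numpy as np
    import networkx as nx

    def mutual_information(x, y):
        """Mutual information (in bits) between two binary activation sequences."""
        joint = np.zeros((2, 2))
        for a, b in zip(x, y):
            joint[a, b] += 1
        joint /= joint.sum()
        px = joint.sum(axis=1)
        py = joint.sum(axis=0)
        mi = 0.0
        for i in range(2):
            for j in range(2):
                if joint[i, j] > 0:
                    mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
        return mi

    def small_world_coefficient(fc, threshold=0.3, seed=0):
        """sigma = (C/C_rand) / (L/L_rand) for a thresholded connectivity matrix."""
        adj = (np.abs(fc) > threshold).astype(int)
        np.fill_diagonal(adj, 0)
        G = nx.from_numpy_array(adj)
        G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
        n, m = G.number_of_nodes(), G.number_of_edges()
        C = nx.average_clustering(G)
        L = nx.average_shortest_path_length(G)
        # Edge-matched Erdos-Renyi baseline (a simple, common choice of null model).
        R = nx.gnm_random_graph(n, m, seed=seed)
        R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
        C_rand = nx.average_clustering(R)
        L_rand = nx.average_shortest_path_length(R)
        return (C / C_rand) / (L / L_rand)

    # Two binary activation sequences: y copies x 80% of the time.
    rng = np.random.default_rng(0)
    x = (rng.random(500) < 0.5).astype(int)
    y = np.where(rng.random(500) < 0.8, x, 1 - x)
    print("I(X;Y) [bits]     =", mutual_information(x, y))

    # A Watts-Strogatz graph standing in for a thresholded functional connectome.
    W = nx.to_numpy_array(nx.watts_strogatz_graph(90, 6, 0.1, seed=1))
    print("small-world sigma =", small_world_coefficient(W, threshold=0.5))

A sigma value well above 1 indicates small-world organization (high clustering at short path lengths); how such topological measures co-vary with the mutual information entropy is the subject of the analyses summarized above.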

Highlights

  • The human brain maintains its performance during perception, cognition, and behavior through information processing in neuronal networks (Linsker, 1988; Gray et al., 1989; Sporns, 2002; Womelsdorf et al., 2007)

  • We show direct evidence that small-world network topology and self-organized criticality are related through the maximization principle of the mutual information entropy

  • We show that the network topology is one of the factors that contribute to the maximization of the mutual information entropy, and that this is accompanied by a phase transition of the topology

Introduction

The human brain maintains its performance during perception, cognition, and behavior through information processing in neuronal networks (Linsker, 1988; Gray et al., 1989; Sporns, 2002; Womelsdorf et al., 2007). Since the brain is spatially limited to a finite volume, it is natural to assume that physical constraints, such as biological costs, require the brain to optimize its function within these limited resources (Achard and Bullmore, 2006; Chen et al., 2006; Bassett et al., 2010; Bullmore and Sporns, 2012). Given these constraints, principles based on information-theoretic quantities, such as the free energy (Friston, 2010) and the mutual information (Linsker, 1990), provide formulations that account for the mechanisms underlying the function and structure of the brain. Understanding the details of these mechanisms, and the effect of these principles on the structural and functional aspects of brain networks, remains an open issue.
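
For reference, the mutual information invoked here is the standard Shannon quantity; for two activity patterns X and Y it can be written as (a textbook definition, stated for clarity rather than taken from the cited formulations):

    I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X,Y)

where H denotes the Shannon entropy, so that maximizing I(X;Y) favors activity patterns that are individually rich (high entropy) yet strongly coordinated across regions.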
