Abstract

Statistical physics determines the abundance of different arrangements of matter depending on cost-benefit balances. Its formalism and phenomenology percolate throughout biological processes and set limits to effective computation. Under specific conditions, self-replicating and computationally complex patterns become favored, yielding life, cognition, and Darwinian evolution. Neurons and neural circuits sit at a crossroads between statistical physics, computation, and (through their role in cognition) natural selection. Can we establish a statistical physics of neural circuits? Such a theory would tell us what kinds of brains to expect under set energetic, evolutionary, and computational conditions. With this big picture in mind, we focus on the fate of duplicated neural circuits. We look at examples from central nervous systems, with an emphasis on computational thresholds that might prompt this redundancy. We also study a naive cost-benefit balance for duplicated circuits implementing complex phenotypes. From this, we derive phase diagrams and (phase-like) transitions between single and duplicated circuits, which constrain evolutionary paths to complex cognition. Returning to the big picture, similar phase diagrams and transitions might constrain the I/O and internal connectivity patterns of neural circuits at large. The formalism of statistical physics seems to be a natural framework for this worthy line of research.

Highlights

  • Statistical physics determines the abundance of different patterns of matter according to cost–benefit calculations

  • Despite the added layer of complexity, statistical physics remains a very apt language to describe some biological processes [6,7,8,9,10]: cell cycles can be studied as thermodynamic cycles [11,12,13], aspects of organisms might stem from an effective free energy minimization [14], and variation along relevant dimensions determines the viability of key living structures, which can be abruptly terminated as in phase transitions [15,16]

  • We focus on much narrower questions: As we wonder in the title, what is the evolutionary fate of duplicated neural circuits, as they confront fixed thermodynamic and computational conditions? Given some metabolic constraints and information-processing needs, is a duplicated neural structure stable? Or is it so redundant that it becomes energetically unjustified?

Introduction

Statistical physics determines the abundance of different patterns of matter according to cost–benefit calculations. Despite the added layer of complexity, statistical physics remains a very apt language to describe some biological processes [6,7,8,9,10]: cell cycles can be studied as thermodynamic cycles [11,12,13], aspects of organisms might stem from an effective free energy minimization (i.e., yet another cost-benefit balance) [14], and variation along relevant dimensions (e.g., organism size vs. metabolic load) determines the viability of key living structures, which can be abruptly terminated as in phase transitions [15,16]. Within this framework, we wonder how to elaborate a biologically grounded statistical physics of neural circuits (Figure 1c). This should bring known thermodynamic aspects of computation into a Darwinian framework where cognition serves biological function [29,44,45,46,47,48] (eventually, to balance metabolic costs and extract free energy from an environment for an organism's advantage).
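As a back-of-the-envelope sketch of this cost-benefit logic (an illustration only, not the specific model developed in the sections below; the symbols E_i, c, and b(n) are introduced here for illustration and are not the paper's notation), statistical physics would assign each circuit configuration i a relative abundance following a Boltzmann-like weight,

    P_i \propto e^{-E_i / k_B T},

where E_i lumps together the metabolic cost and (with negative sign) the functional benefit of configuration i. A duplicated circuit carrying an extra metabolic cost c on top of the single circuit would then be favored whenever its extra functional benefit exceeds that cost,

    b(2) - b(1) > c,

and the locus b(2) - b(1) = c would trace a phase-like boundary between single- and duplicated-circuit regimes in a cost-benefit plane.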

The Two Hemispheres
The Perisylvian Network for Human Language
Internalizing the Control of Movement
Place and Grid Cells—A Twofold Representation of Space?
Somatosensory and Motor Cortices
Reactive versus Predictive Brains
The Cortical Column
Simple Models for a Complex Research Line
A Naive Cost-Efficiency Model of Duplicated Circuitry for Complex Tasks
The Garden of Forking Neural Structures
Discussion