Abstract

Information processing in the brain depends on the integration of synaptic input distributed throughout neuronal dendrites. Dendritic integration is a hierarchical process, proposed to be equivalent to integration by a multilayer network, potentially endowing single neurons with substantial computational power. However, whether neurons can learn to harness dendritic properties to realize this potential is unknown. Here, we develop a learning rule from dendritic cable theory and use it to investigate the processing capacity of a detailed pyramidal neuron model. We show that computations using spatial or temporal features of synaptic input patterns can be learned, and even synergistically combined, to solve a canonical nonlinear feature-binding problem. The voltage dependence of the learning rule drives coactive synapses to engage dendritic nonlinearities, whereas spike-timing dependence shapes the time course of subthreshold potentials. Dendritic input-output relationships can therefore be flexibly tuned through synaptic plasticity, allowing optimal implementation of nonlinear functions by single neurons.
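To make the summary's description of the learning rule more concrete, the following is a minimal, hypothetical Python sketch of a voltage- and spike-timing-dependent weight update. It is not the rule derived in the paper; the function name, the sigmoidal voltage gate, the exponential timing kernel, and all numerical constants are illustrative assumptions.

```python
import numpy as np

def plasticity_update(w, pre_spike_t, post_spike_t, v_dend,
                      eta=0.01, v_half=-40.0, v_slope=5.0, tau_stdp=20.0):
    """Illustrative voltage- and spike-timing-dependent weight update.

    w            : current synaptic weight
    pre_spike_t  : presynaptic spike time (ms)
    post_spike_t : somatic spike time (ms)
    v_dend       : local dendritic potential at the synapse (mV)
    All constants are placeholder assumptions, not values from the paper.
    """
    # Voltage dependence: a sigmoidal gate, so synapses on strongly depolarized
    # branches (e.g., where NMDA receptor nonlinearities are engaged) change more.
    voltage_gate = 1.0 / (1.0 + np.exp(-(v_dend - v_half) / v_slope))

    # Spike-timing dependence: exponential kernel that favors pre-before-post
    # pairings and weakly depresses post-before-pre pairings.
    dt = post_spike_t - pre_spike_t
    timing_kernel = np.exp(-abs(dt) / tau_stdp) * (1.0 if dt > 0 else -0.5)

    return w + eta * voltage_gate * timing_kernel
```

In this toy rule, a synapse active 10 ms before a somatic spike on a branch depolarized to -30 mV potentiates strongly, whereas the same pairing on a branch near rest produces almost no change, mirroring how coactive synapses are driven to engage dendritic nonlinearities.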

Highlights

  • An essential role of each neuron in a circuit is to transform a barrage of synaptic input into a meaningful stream of action potential output

  • It has been proposed that the combination of dendritic morphology and local NMDA receptor-dependent nonlinearities form a hierarchical processing structure with substantial computational power (Mel, 1992b; Archie and Mel, 2000; Poirazi and Mel, 2001)

  • Our approach is general; we focus on understanding how dendritic morphology and NMDA receptor-dependent excitability can be recruited without requiring structured connectivity



Introduction

An essential role of each neuron in a circuit is to transform a barrage of synaptic input into a meaningful stream of action potential output. In the first stage of processing, synaptic input is integrated nonlinearly within individual dendrites, followed by a subsequent stage in which current flowing from dendrites is integrated at the soma, elegantly summarized as an equivalence of single neurons to multilayer neural networks (Poirazi et al., 2003a, 2003b; Jadi et al., 2014; Ujfalussy et al., 2018; Beniaguev et al., 2021; Jones and Kording, 2021a). This influential theory is incomplete, however, as it has yet to be comprehensively determined how, or whether, computations that capitalize on dendritic physiology can be learned.
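As a point of reference for this two-stage picture, here is a minimal sketch of the multilayer abstraction, in the spirit of the cited subunit models (e.g., Poirazi et al., 2003): each dendritic subunit applies a sigmoidal nonlinearity to its own summed synaptic input, and the soma then sums the subunit outputs to determine spiking. The gain, threshold, and example inputs are arbitrary illustrative values, not parameters from any of the cited studies.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def two_layer_neuron(synaptic_input, weights, subunit_gain=4.0, soma_threshold=1.5):
    """Sketch of a two-layer dendritic-subunit model.

    synaptic_input : (n_subunits, n_synapses) array of 0/1 presynaptic activity
    weights        : array of the same shape holding synaptic weights
    """
    # Layer 1: each dendritic subunit integrates its own synapses nonlinearly.
    subunit_drive = (weights * synaptic_input).sum(axis=1)
    subunit_output = sigmoid(subunit_gain * (subunit_drive - 1.0))

    # Layer 2: the soma sums the subunit outputs; 1 approximates an action potential.
    return int(subunit_output.sum() > soma_threshold)

# Example: two dendritic subunits, each receiving three synapses.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=(2, 3))     # binary input pattern
w = rng.uniform(0.2, 0.8, size=(2, 3))  # synaptic weights
print(two_layer_neuron(x, w))
```

Because each subunit saturates on its own inputs, which synapses share a branch matters, and this is what allows such a model to compute nonlinear functions (such as feature binding) that a purely linear point neuron cannot.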
