Abstract

The information processing abilities of neural circuits arise from their synaptic connection patterns. Understanding the laws governing these connectivity patterns is essential for understanding brain function. The overall distribution of synaptic strengths of local excitatory connections in cortex and hippocampus is long-tailed, exhibiting a small number of synaptic connections of very large efficacy. At the same time, new synaptic connections are constantly being created and individual synaptic connection strengths show substantial fluctuations across time. It remains unclear through what mechanisms these properties of neural circuits arise and how they contribute to learning and memory. In this study we show that fundamental characteristics of excitatory synaptic connections in cortex and hippocampus can be explained as a consequence of self-organization in a recurrent network combining spike-timing-dependent plasticity (STDP), structural plasticity and different forms of homeostatic plasticity. In the network, associative synaptic plasticity in the form of STDP induces a rich-get-richer dynamics among synapses, while homeostatic mechanisms induce competition. Under distinctly different initial conditions, the ensuing self-organization produces long-tailed synaptic strength distributions matching experimental findings. We show that this self-organization can take place with a purely additive STDP mechanism and that multiplicative weight dynamics emerge as a consequence of network interactions. The observed patterns of fluctuation of synaptic strengths, including elimination and generation of synaptic connections and long-term persistence of strong connections, are consistent with the dynamics of dendritic spines found in rat hippocampus. Beyond this, the model predicts an approximately power-law scaling of the lifetimes of newly established synaptic connection strengths during development. Our results suggest that the combined action of multiple forms of neuronal plasticity plays an essential role in the formation and maintenance of cortical circuits.
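
To make the mechanism concrete, the following is a minimal sketch, in Python, of the two ingredients emphasized above: additive STDP, which creates a rich-get-richer effect because strong synapses make their postsynaptic targets more likely to fire shortly after the presynaptic neuron, and homeostatic synaptic normalization, which forces the synapses onto each neuron to compete for a fixed total weight. The network size, all parameter values, and the probabilistic spiking rule are illustrative assumptions, and the structural plasticity and additional homeostatic mechanisms of the actual model are omitted; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper)
N = 100            # number of excitatory units
steps = 5000       # simulation steps
p_ext = 0.02       # baseline external drive (spike probability per step)
A_plus = 0.0020    # additive potentiation amplitude
A_minus = 0.0021   # additive depression amplitude (slight depression bias)
tau = 20.0         # STDP trace time constant (in steps)
w_total = 1.0      # target summed incoming weight per neuron

# Sparse random connectivity with small initial weights; W[i, j] is the
# weight from presynaptic neuron j onto postsynaptic neuron i.
W = (rng.random((N, N)) < 0.1) * rng.uniform(0.01, 0.02, size=(N, N))
np.fill_diagonal(W, 0.0)

trace = np.zeros(N)                  # exponentially decaying spike traces
spikes = np.zeros(N)                 # spikes of the previous step

for t in range(steps):
    # Probabilistic spiking: recurrent drive raises the firing probability,
    # so strong synapses increase pre/post correlations (rich-get-richer).
    drive = W @ spikes
    p_fire = 1.0 - np.exp(-(p_ext + drive))
    spikes = (rng.random(N) < p_fire).astype(float)

    trace *= np.exp(-1.0 / tau)

    # Additive STDP on existing synapses: a post spike following a recent
    # pre spike potentiates, a pre spike following a recent post spike depresses.
    mask = W > 0
    dW = A_plus * np.outer(spikes, trace) - A_minus * np.outer(trace, spikes)
    W[mask] = np.clip(W[mask] + dW[mask], 0.0, None)

    trace += spikes

    # Homeostatic synaptic normalization: each neuron's summed incoming
    # weight is held at w_total, so synapses compete for a fixed resource.
    totals = W.sum(axis=1, keepdims=True)
    totals[totals == 0.0] = w_total   # leave neurons without inputs untouched
    W *= w_total / totals

# Inspect the surviving weights; in the full model of the paper this kind of
# competition yields long-tailed (log-normal-like) strength distributions.
w = W[W > 0]
print(f"n = {w.size}, median = {np.median(w):.4f}, max = {w.max():.4f}")
```

Because the update rule here is purely additive, any multiplicative-looking behavior of individual weights in such a simulation has to emerge from the network interactions and the normalization step, which is the point the abstract makes.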

Highlights

  • The computations performed by cortical circuits depend on their detailed patterns of synaptic connection strengths

  • We present a computational model that explains these fundamental properties of neural circuits as a consequence of network self-organization resulting from the combined action of different forms of neuronal plasticity

  • This self-organization is driven by a rich-get-richer effect induced by an associative synaptic learning mechanism, which is kept in check by several homeostatic plasticity mechanisms that stabilize the network

Introduction

The computations performed by cortical circuits depend on their detailed patterns of synaptic connection strengths. While the gross patterning of connections across different cortical layers has been well described in some cases [1,2], the detailed connectivity structure between groups of cells and its relation to information processing have been notoriously difficult to investigate [3]. This detailed structure could either be largely random, the product of somewhat arbitrary growth processes, or it could be highly organized. We show that fundamental aspects of the microscopic structure of cortical networks can be understood as the product of self-organization.
