Abstract

Sequences of precisely timed neuronal activity are observed in many brain areas in various species. Synfire chains are a well-established model that can explain such sequences. However, it is unknown under which conditions synfire chains can develop by self-organization in initially unstructured networks. This work shows that with spike-timing dependent plasticity (STDP), modulated by global population activity, long synfire chains emerge in sparse random networks. The learning rule encourages neurons to participate multiple times in a chain, or in multiple chains. Such reuse of neurons has been observed experimentally and is necessary for high capacity. Sparse connectivity prevents the chains from being short and cyclic, and shows that the formation of specific synapses is not essential for chain formation. Analysis of the learning rule in a simple network of binary threshold neurons reveals the asymptotically optimal length of the emerging chains. The theoretical results generalize to simulated networks of conductance-based leaky integrate-and-fire (LIF) neurons. As an application of the emerging chains, we propose a one-shot memory for sequences of precisely timed neuronal activity.
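
The abstract describes STDP modulated by global population activity acting on a sparse random network of binary threshold neurons. The sketch below is an illustrative toy implementation of that general idea, not the authors' actual model: the network size, sparsity, threshold, learning rate, target activity level, and the specific form of the global modulation term are all assumptions chosen for demonstration.

```python
# Illustrative sketch (assumed parameters, not the paper's model):
# STDP modulated by global population activity in a sparse random
# network of binary threshold neurons.
import numpy as np

rng = np.random.default_rng(0)

N = 400            # number of binary threshold neurons (assumed)
p_conn = 0.1       # connection sparsity (assumed)
w_max = 1.0        # upper bound on synaptic weights (assumed)
eta = 0.05         # learning rate (assumed)
a_target = 0.05    # target fraction of active neurons per step (assumed)
theta = 0.5        # firing threshold (assumed)

# Sparse random connectivity: the mask fixes which synapses exist;
# weights start small and random on existing synapses only.
mask = rng.random((N, N)) < p_conn
np.fill_diagonal(mask, False)
W = mask * rng.random((N, N)) * 0.2

def step(x_prev, W):
    """One synchronous update of the binary threshold network."""
    drive = W @ x_prev
    return (drive > theta).astype(float)

# Seed activity: a small random group of neurons fires at t = 0.
x = np.zeros(N)
x[rng.choice(N, size=int(a_target * N), replace=False)] = 1.0

for t in range(2000):
    x_new = step(x, W)

    # Global modulation: scale plasticity by how far population activity
    # deviates from the target rate (assumed form of the modulation).
    activity = x_new.mean()
    modulation = 1.0 - activity / a_target   # > 0 if too quiet, < 0 if too active

    # Pairwise STDP on existing synapses: a presynaptic neuron active at t-1
    # followed by a postsynaptic neuron active at t is potentiated; presynaptic
    # activity without a postsynaptic spike is depressed.
    potentiation = np.outer(x_new, x)        # post(t) x pre(t-1)
    depression = np.outer(1.0 - x_new, x)
    dW = eta * (potentiation - 0.5 * depression) * (1.0 + modulation)
    W = np.clip((W + dW) * mask, 0.0, w_max)

    x = x_new
```

Under this kind of rule, groups of neurons that reliably fire one time step apart strengthen their feed-forward connections while the global term keeps overall activity near the target, which is the qualitative mechanism by which feed-forward chain structure can self-organize; the paper's full model and its analysis of optimal chain length use spiking (LIF) neurons and a specific modulated STDP rule not reproduced here.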
