Abstract
Noise-delayed decay occurs when the first-spike latency of a periodically forced neuron exhibits a maximum at a particular noise intensity. Here we investigate this phenomenon at the network level, in particular by considering scale-free neuronal networks, and under the realistic assumption that noise arises from the stochastic nature of voltage-gated ion channels embedded in the neuronal membranes. We show that noise-delayed decay can be observed at the network level, but only if the synaptic coupling between the neurons is weak. In the case of strong coupling, or in a highly interconnected population, the phenomenon vanishes, indicating that delays in signal detection can no longer be resonantly prolonged by noise. We also find that potassium channel noise plays a more dominant role in the occurrence of noise-delayed decay than sodium channel noise, and that poisoning the neuronal membranes may weaken or intensify the phenomenon depending on which channel type is targeted.
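The latency maximum described above can be probed numerically by measuring the mean first-spike latency of a noisy, periodically forced neuron as a function of noise intensity. The following is a minimal sketch of such a measurement using a leaky integrate-and-fire caricature with additive white noise, not the stochastic Hodgkin-Huxley formulation with explicit channel noise studied in the paper; all parameter values (threshold, drive amplitude, time constant) are illustrative assumptions.

```python
import numpy as np

def mean_first_spike_latency(noise_sigma, n_trials=100, seed=0):
    """Mean first-spike latency (ms) of a leaky integrate-and-fire neuron
    driven by a suprathreshold sinusoid plus white noise (Euler-Maruyama).
    All parameters are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    dt, t_max = 0.05, 50.0            # time step and trial length (ms)
    tau, v_th = 10.0, 1.0             # membrane time constant, spike threshold
    amp, freq = 2.0, 0.02             # drive amplitude and frequency (cycles/ms)
    n_steps = int(t_max / dt)
    t = np.arange(n_steps) * dt
    drive = amp * np.sin(2.0 * np.pi * freq * t)   # periodic forcing
    latencies = []
    for _ in range(n_trials):
        v = 0.0
        xi = rng.standard_normal(n_steps)
        for i in range(n_steps):
            # Euler-Maruyama step of dv = (-v + drive)/tau dt + sigma dW
            v += dt / tau * (-v + drive[i]) + noise_sigma * np.sqrt(dt) * xi[i]
            if v >= v_th:
                latencies.append(t[i])   # record time of first spike
                break
        else:
            latencies.append(t_max)      # no spike within the window
    return float(np.mean(latencies))

# Sweep noise intensity; noise-delayed decay would show up as a
# maximum of the latency curve at an intermediate noise level.
sigmas = [0.0, 0.05, 0.1, 0.2, 0.4]
latency_curve = [mean_first_spike_latency(s) for s in sigmas]
```

In this sketch the drive is chosen suprathreshold, so the deterministic neuron spikes within the first forcing cycle; sweeping `noise_sigma` then reveals how noise shifts the mean first-spike latency. Whether a pronounced latency maximum appears depends on the parameter regime.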
Published in: Chaos, Solitons and Fractals: the interdisciplinary journal of Nonlinear Science, and Nonequilibrium and Complex Phenomena