Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks’ spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities—including those of different cell types—combine with connectivity to shape population activity and function.
Highlights
A fundamental goal in computational neuroscience is to understand how network connectivity and intrinsic neuronal dynamics relate to collective neural activity, and in turn drive neural computation.
How can we combine knowledge of these two things (models of individual neurons and of their interactions) to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state.
We study a fundamental effect of nonlinear input-rate transfer, the coupling between different orders of spiking statistics, and how this effect depends on single-neuron and network properties.
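To make this coupling concrete, here is a minimal numerical sketch (an illustration under assumed parameters, not the paper's method): for a nonlinear input-rate transfer function, the mean output rate depends on the variance of the input as well as its mean, whereas a description linearized about the mean input predicts a rate that ignores the variance. The threshold-quadratic transfer `phi`, the Gaussian input, and the values of `mu` and `sigma` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    """Threshold-quadratic input-rate transfer (illustrative choice)."""
    return np.maximum(x, 0.0) ** 2

mu = 0.5  # mean input (assumed value)
print("sigma   E[phi(x)]   phi(mu) (linearized prediction)")
for sigma in (0.1, 0.5, 1.0):                 # input fluctuation strengths (assumed)
    x = rng.normal(mu, sigma, size=200_000)   # Gaussian input samples
    # Linearizing phi about mu predicts a rate phi(mu), independent of sigma;
    # the full nonlinear average E[phi(x)] grows with the input variance.
    print(f"{sigma:5.1f}   {phi(x).mean():9.3f}   {phi(mu):9.3f}")
```

In this toy setting the empirical rate E[phi(x)] grows with sigma while the linearized prediction phi(mu) stays fixed; this is the elementary sense in which a nonlinear transfer couples second-order input statistics to first-order spiking statistics.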
Summary
A fundamental goal in computational neuroscience is to understand how network connectivity and intrinsic neuronal dynamics relate to collective neural activity, and in turn drive neural computation. Any model of neural activity should capture the often-strong variability in spike trains across time or experimental trials. This variability in spiking is often coordinated (correlated) across cells, which has a variety of implications. Correlations between synaptic inputs control their effect on postsynaptic neurons: inputs that arrive simultaneously can produce stronger responses than those arriving separately. This has been referred to as "synergy" or "synchronous gain" in early work [7], and the magnitude of this synergy has been measured in the LGN by Usrey, Reppas & Reid [8] and in cortex by Bruno & Sakmann [9] (but see [10]).

Consider a spike train $dN(t)$ whose events are generated independently at each time point, with instantaneous rate $r(t)$. Because the events are generated independently at each time point, all of the cumulants of this process can be written as

\[
\left\langle\!\left\langle \prod_{i=1}^{k} dN(t_i) \right\rangle\!\right\rangle \;=\; r(t_1)\,\prod_{i=2}^{k}\delta(t_i - t_1)\,\prod_{i=1}^{k} dt_i .
\]

The delta functions arise because the process is independent at each time step, so that there is no correlation between events at one time $t$ and any other time $t'$.
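As a numerical check of this cumulant structure, the following sketch (with arbitrary assumed values for the rate `r`, bin width `dt`, and duration `T`; not code from the paper) simulates a constant-rate spike train with independent counts in each time bin and verifies that the mean count per bin is r*dt and that the autocovariance of the counts is concentrated at zero lag, the discrete-time counterpart of the delta function above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumed, not from the paper)
r, dt, T = 10.0, 1e-3, 2000.0   # rate (spikes/s), bin width (s), duration (s)
nbins = int(T / dt)

# dN: spike counts in each bin; independent Poisson counts with mean r*dt
dN = rng.poisson(r * dt, size=nbins)

# First cumulant: <<dN(t)>> = r*dt
print("mean count per bin:", dN.mean(), "expected:", r * dt)

# Second cumulant: <<dN(t) dN(t')>> = (r*dt) * delta_{t,t'} in discrete time
x = dN - dN.mean()
for lag in range(4):
    cov = np.mean(x[: nbins - lag] * x[lag:])
    expected = r * dt if lag == 0 else 0.0
    print(f"autocovariance at lag {lag}: {cov:+.2e}  (expected ~ {expected:.2e})")
```

At zero lag the autocovariance matches r*dt (the bin-count variance of a Poisson process), while at nonzero lags it is close to zero, consistent with the delta function in the formula.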