Abstract
The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. 
In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may make it possible to recover relevant aspects of the nonlinear dynamics underlying observed neuronal time series, and to link these directly to computational properties.
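To make the model class concrete, the following is a minimal generative sketch of a PLRNN state space model as described above: piecewise-linear latent dynamics driven by Gaussian process noise, observed through a noisy linear readout. All dimensions and parameter values here are illustrative assumptions; in the actual framework the parameters are estimated from data by Expectation-Maximization, not chosen by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: M latent states, N observed units, T time steps
M, N, T = 3, 10, 200

# PLRNN parameters (random here purely for illustration; the paper's
# EM procedure would estimate these from observed time series)
A = np.diag(rng.uniform(0.5, 0.9, M))   # diagonal auto-regression weights
W = 0.1 * rng.standard_normal((M, M))   # off-diagonal coupling weights
np.fill_diagonal(W, 0.0)
h = 0.1 * rng.standard_normal(M)        # bias term
B = rng.standard_normal((N, M))         # observation (loading) matrix
sigma_proc, sigma_obs = 0.05, 0.1       # process / observation noise SDs

z = np.zeros((T, M))                    # latent state trajectory
x = np.zeros((T, N))                    # simulated observations
for t in range(1, T):
    # Piecewise-linear latent dynamics: ReLU nonlinearity max(0, z)
    z[t] = (A @ z[t - 1] + W @ np.maximum(0.0, z[t - 1]) + h
            + sigma_proc * rng.standard_normal(M))
    # Linear-Gaussian observation model on top of the latent states
    x[t] = B @ z[t] + sigma_obs * rng.standard_normal(N)
```

The piecewise-linear form is what makes semi-analytical inference tractable: within each linear region the model behaves like a linear-Gaussian state space model, while the max(0, ·) nonlinearity still admits multi-stability, limit cycles, and chaos.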
Highlights
Stochastic neural dynamics mediate between the underlying biophysical and physiological properties of a neural system and its computational and cognitive properties (e.g. [1,2,3,4])
Neuronal dynamics mediate between the physiological and anatomical properties of a neural system and the computations it performs, and may be seen as the ‘computational language’ of the brain
It is of great interest to recover from experimentally recorded time series, like multiple single-unit or neuroimaging data, the underlying stochastic network dynamics and, ideally, even equations governing their statistical evolution
Summary
Stochastic neural dynamics mediate between the underlying biophysical and physiological properties of a neural system and its computational and cognitive properties (e.g. [1,2,3,4]). We commonly have access only to noisy recordings from a relatively small proportion of neurons (compared to the size of the brain area of interest), or to lumped surface signals like local field potentials or the EEG. Inferring the computationally relevant dynamics from these is not trivial, especially since both the recorded signals (e.g., spike sorting errors; [5]) and the neural system dynamics itself (e.g., stochastic synaptic release; [6]) come with a good deal of noise. ‘Model-free’ techniques in the statistical sense, which combine delay embedding methods with nonlinear basis expansions and kernel techniques, have been one approach to the problem [11,12]. These techniques provide informative lower-dimensional visualizations of population trajectories and (local) approximations to the neural flow field, but they may highlight only certain, salient aspects of the dynamics (but see [13]) and, in any case, do not directly return distribution-generating equations or the underlying computations. Model-based (state space) approaches, in turn, have commonly assumed linear latent dynamics. While this may often be sufficient to yield lower-dimensional smoothed trajectories, it implies that the recovered dynamical model may be less apt for capturing highly nonlinear dynamical phenomena in the observations, and will by itself not be powerful enough to reproduce a range of important dynamical and computational processes in the nervous system, among them multi-stability, which has been proposed to underlie neural activity during working memory [28,29,30,31,32], limit cycles (stable oscillations), and chaos (e.g. [33])
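The abstract mentions that models were estimated from kernel-smoothed spike time data. As a hedged illustration of that preprocessing step (the function name, bandwidth, and spike times below are hypothetical, not taken from the paper), a Gaussian kernel can convert discrete spike times into a continuous rate estimate:

```python
import numpy as np

def kernel_smooth(spike_times, t_grid, bandwidth=0.05):
    """Gaussian-kernel smoothing of a spike train into a rate estimate.

    spike_times: 1-D array of spike times (s)
    t_grid:      time points (s) at which to evaluate the smoothed rate
    bandwidth:   kernel SD (s); the value here is an illustrative choice
    """
    # Pairwise differences between grid points and spikes: (len(t_grid), n_spikes)
    diffs = t_grid[:, None] - np.asarray(spike_times)[None, :]
    # Gaussian kernel centered on each spike, normalized to unit area
    kernels = (np.exp(-0.5 * (diffs / bandwidth) ** 2)
               / (bandwidth * np.sqrt(2.0 * np.pi)))
    # Summing kernels over spikes gives an instantaneous rate (spikes/s)
    return kernels.sum(axis=1)

# Usage with a made-up spike train over a 1 s window
spikes = np.array([0.1, 0.12, 0.5, 0.52, 0.9])
t = np.linspace(0.0, 1.0, 101)
rate = kernel_smooth(spikes, t)
```

Because each kernel integrates to one, the area under the smoothed rate approximately recovers the spike count, which is what makes such traces suitable as continuous-valued observations for a state space model.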