Abstract

In many brain areas, sensory responses are heavily modulated by factors including attentional state, context, reward history, motor preparation, learned associations, and other cognitive variables. Modelling the effect of these modulatory factors on sensory responses has proven challenging, mostly due to the time-varying and nonlinear nature of the underlying computations. Here we present a computational model capable of capturing and dissociating multiple time-varying modulatory effects on neuronal responses with millisecond resolution. The model’s performance is tested on extrastriate perisaccadic visual responses in nonhuman primates. Visual neurons respond to stimuli presented around the time of saccades differently than during fixation. These perisaccadic changes include sensitivity to stimuli presented at locations outside the neuron’s receptive field, which suggests a contribution of multiple sources to perisaccadic response generation. Current computational approaches cannot quantitatively characterize the contribution of each modulatory source to response generation, mainly due to the very short timescale on which the saccade takes place. In this study, we use a high spatiotemporal resolution experimental paradigm along with a novel extension of the generalized linear model (GLM) framework, termed the sparse-variable GLM, which allows for time-varying model parameters representing the temporal evolution of the system with a resolution on the order of milliseconds. We used this model framework to precisely map the temporal evolution of the spatiotemporal receptive field of visual neurons in the middle temporal area during the execution of a saccade. Moreover, an extended model based on a factorization of the sparse-variable GLM allowed us to dissociate and quantify the contribution of individual sources to the perisaccadic response. Our results show that this novel framework can precisely capture the changes in neurons’ sensitivity around the time of saccades, and provides a general approach to quantitatively track the role of multiple modulatory sources over time.
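To give a concrete sense of the modelling idea described above, the sketch below shows one way a sparse-variable GLM can be set up: a Poisson GLM whose stimulus weights are allowed to differ at every time bin, with an L1 penalty on the temporal differences of the weights so that changes across time remain sparse. This is a minimal, hypothetical sketch for intuition only; the function names, the fused-lasso-style penalty, and the toy data are assumptions of this example, not the estimator used in the study.

```python
# Minimal sketch of a sparse-variable Poisson GLM (illustrative only): the
# stimulus weights may change at every time bin, and an L1 penalty on their
# temporal differences keeps those changes sparse. Names and penalty form are
# assumptions of this example, not the paper's implementation.
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(w_flat, X, y, lam):
    """Poisson negative log-likelihood plus an L1 penalty on temporal changes.

    X : (T, D) design matrix, one row of stimulus covariates per time bin
    y : (T,)   observed spike counts per time bin
    w_flat : flattened (T, D) array of time-varying weights
    lam : penalty strength controlling how sparsely the weights change
    """
    T, D = X.shape
    W = w_flat.reshape(T, D)
    log_rate = np.sum(X * W, axis=1)                      # linear drive per time bin
    rate = np.exp(log_rate)                               # conditional intensity
    nll = np.sum(rate - y * log_rate)                     # Poisson NLL (up to a constant)
    change_penalty = lam * np.sum(np.abs(np.diff(W, axis=0)))
    return nll + change_penalty

def fit_sparse_variable_glm(X, y, lam=1.0):
    """Fit the time-varying weights; returns a (T, D) weight matrix."""
    T, D = X.shape
    w0 = np.zeros(T * D)
    res = minimize(neg_log_posterior, w0, args=(X, y, lam), method="L-BFGS-B")
    return res.x.reshape(T, D)

# Toy usage on synthetic data: 200 time bins, 5 stimulus covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = rng.poisson(1.0, size=200)
W_hat = fit_sparse_variable_glm(X, y, lam=5.0)            # weights per (time bin, covariate)
```

In this toy setting, increasing `lam` pushes neighbouring time bins toward identical weights, which is the sense in which the parameters are “sparse-variable”: they are free to evolve over time, but only change when the data demand it.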

Highlights

  • In many brain areas, particularly ‘associative’ regions including parietal and prefrontal cortex, sensory processing is affected by various intrinsic or extrinsic nonsensory covariates such as task or context variables, attention, learned associations, motor preparation, or cognition-related control signals

  • The model’s performance is evaluated by testing its ability to reproduce and dissociate multiple changes in visual sensitivity occurring in extrastriate visual cortex around the time of rapid eye movements

  • To address the need for a quantitative means to study nonstationary responses across a saccade, we recently developed an extension of the widely-used generalized linear models (GLMs) [42,43,44,45,46,47], termed the nonstationary generalized linear model (NSGLM, referred to as the N-model in this article) to describe response dynamics in visual neurons across a saccade [48]



Introduction

In many brain areas, particularly ‘associative’ regions including parietal and prefrontal cortex, sensory processing is affected by various intrinsic or extrinsic nonsensory covariates such as task or context variables, attention, learned associations, motor preparation, or cognition-related control signals. The fact that multiple such variables may simultaneously modulate sensory activity, and that their influence can change rapidly over the course of a task, poses challenges for precise experimental or computational quantification of their relative contributions to neuronal responses. There are several reports suggesting that the spatial distribution of the population of visual neurons’ receptive fields (RFs) changes during or just prior to a saccade [19,20,21]. Taken together, these findings indicate that the perisaccadic responses evoked in visual neurons are modulated by several sources; that is, stimuli presented perisaccadically at multiple locations in the visual field contribute to driving neurons’ responses.
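To make the “multiple sources” idea concrete, the sketch below treats the log firing rate as a sum of per-source drives, one for each candidate stimulus location, each with its own time-varying kernel; once such a model is fitted, the drive contributed by each source can be read off separately over time. This is an illustrative decomposition with hypothetical names, not the factorized model from the paper.

```python
# Illustrative sketch (not the paper's factorized model): the log firing rate is
# written as a sum of contributions from several candidate sources, e.g. stimuli
# at different locations in and around the receptive field, each with its own
# time-varying kernel. Per-source drives can then be inspected separately.
import numpy as np

def predicted_rate(stimuli, kernels):
    """stimuli : dict mapping source name -> (T, D) covariate matrix
       kernels : dict mapping source name -> (T, D) time-varying weights
       returns : (T,) predicted firing rate"""
    T = next(iter(stimuli.values())).shape[0]
    drive = np.zeros(T)
    for src, X in stimuli.items():
        drive += np.sum(X * kernels[src], axis=1)   # contribution of this source
    return np.exp(drive)

def source_contributions(stimuli, kernels):
    """Time course of the drive contributed by each source, for dissociation."""
    return {src: np.sum(X * kernels[src], axis=1) for src, X in stimuli.items()}

# Toy usage: two hypothetical sources, the RF location and a remote location.
rng = np.random.default_rng(1)
stimuli = {"rf": rng.normal(size=(200, 5)), "remote": rng.normal(size=(200, 5))}
kernels = {"rf": rng.normal(scale=0.1, size=(200, 5)),
           "remote": rng.normal(scale=0.1, size=(200, 5))}
rate = predicted_rate(stimuli, kernels)
per_source = source_contributions(stimuli, kernels)      # e.g. per_source["remote"]
```

Because each source enters the predicted rate additively in the log domain, its time-resolved contribution is directly interpretable, which is the property that allows perisaccadic modulation from inside and outside the classical receptive field to be quantified separately.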
