Abstract

Primates use perceptual and mnemonic visuospatial representations to perform everyday functions. Neurons in the lateral prefrontal cortex (LPFC) have been shown to encode both of these representations during tasks where eye movements are strictly controlled and visual stimuli are reduced in complexity. This raises the question of whether perceptual and mnemonic representations encoded by LPFC neurons remain robust during naturalistic vision, in the presence of rich visual scenes and during eye movements. Here we investigate this issue by training macaque monkeys to perform working memory and perception tasks in a visually complex virtual environment that requires navigation using a joystick and allows free visual exploration of the scene. We recorded the activity of 3950 neurons in the LPFC (areas 8a and 9/46) of two male rhesus macaques using multielectrode arrays, and measured eye movements using video tracking. We found that navigation trajectories to target locations and eye movement behavior differed between the perception and working memory tasks, suggesting that the animals used different behavioral strategies. Single neurons were tuned to target location during cue encoding and the working memory delay, and neural ensemble activity was predictive of the animals' behavior. Neural decoding of the target location was stable throughout the working memory delay epoch. However, neural representations of similar target locations differed between the working memory and perception tasks. These findings indicate that, during naturalistic vision, LPFC neurons maintain robust and distinct neural codes for mnemonic and perceptual visuospatial representations.

SIGNIFICANCE STATEMENT

We show that lateral prefrontal cortex neurons encode working memory and perceptual representations during a naturalistic task set in a virtual environment. We show that, despite eye movements and complex visual input, neurons maintain robust working memory representations of space that are distinct from neuronal representations for perception. We further provide novel insight into the use of virtual environments to construct behavioral tasks for electrophysiological experiments.
