Neuronal populations expand their information-encoding capacity through neurons with mixed selectivity. This is particularly prominent in association areas such as the lateral prefrontal cortex (LPFC), which integrate information from multiple sensory systems. However, it is unclear how LPFC neuronal ensembles process space- and time-varying information about task features under conditions that approximate natural behaviors. Here, we show that, during a virtual reality task with naturalistic elements that requires associative memory, individual neurons and neuronal ensembles in the primate LPFC dynamically mix unconstrained features of the task, such as eye movements, with task-related visual features. Neurons in dorsal regions show greater selectivity for space and eye movements, whereas neurons in ventral regions show greater selectivity for visual features, representing them in a separate subspace. In summary, LPFC neurons exhibit dynamic and mixed selectivity for unconstrained and constrained task elements, and neural ensembles can separate task features into different subspaces.