Abstract

In the jungle, survival is highly correlated with the ability to detect and distinguish between an approaching predator and a putative prey. From an ecological perspective, a predator rapidly approaching its prey is a stronger cue for flight than a slowly moving predator. In the present study, we use functional magnetic resonance imaging in the nonhuman primate to investigate the neural bases of the prediction of an impact to the body by a looming stimulus, i.e., the neural bases of the interaction between a dynamic visual stimulus approaching the body and its expected consequences for an independent sensory modality, namely, touch. We identify a core cortical network of occipital, parietal, premotor, and prefrontal areas maximally activated by tactile stimulations presented at the predicted time and location of impact of the looming stimulus on the face, compared with the activations observed for spatially or temporally incongruent tactile and dynamic visual cues. These activations reflect an active integration both of visual and tactile information and of spatial and temporal prediction information. The identified cortical network coincides with a well-described multisensory visuotactile convergence and integration network suggested to play a key role in the definition of peripersonal space. These observations are discussed in the context of multisensory integration, spatial and temporal prediction, and Bayesian causal inference.

Significance Statement

Looming stimuli have particular ecological relevance, as they are expected to come into contact with the body, evoking touch or pain sensations and possibly triggering an approach or escape behavior depending on their identity. Here, we identify the nonhuman primate functional network that is maximally activated by tactile stimulations presented at the predicted time and location of impact of a looming stimulus. Our findings suggest that the integration of spatial and temporal predictive cues relies on the same neural mechanisms that are involved in multisensory integration.
