Abstract

The virtual reality (VR) environment is claimed to be highly immersive, such that participants may be unaware of their real, external world. In the present study, irrelevant auditory stimuli were presented while participants were engaged in an easy or a difficult visual working memory (WM) task within the VR environment. The difficult WM task was expected to be immersive and to require many cognitive resources, leaving few available for the processing of the task-irrelevant auditory stimuli. Sixteen young adults wore a 3D head-mounted VR device. In the easy WM task, the stimuli were nameable objects; in the difficult WM task, the stimuli were abstract objects that could not be easily named. A novel paradigm using event-related potentials (ERPs) was implemented to examine the feasibility of quantifying the extent to which task-irrelevant stimuli occurring outside of the VR environment are processed. Auditory stimuli irrelevant to the WM task were presented concurrently, every 1.5 or 12 s in separate conditions. Performance on the WM task varied with task difficulty, with accuracy significantly lower during the difficult task. The auditory ERPs consisted of an N1 and a later P2/P3a deflection, both of which were larger when the auditory stimuli were presented slowly. ERPs were unaffected by task difficulty, but significant correlations were found between WM performance and ERP amplitudes: N1 and P2/P3a amplitudes were smallest when performance on the easy WM task was highest. It is possible that even the easy WM task was sufficiently immersive and demanding of processing resources that few remained available for the co-processing of the task-irrelevant auditory stimuli.
