Abstract

When we move through a real environment, egocentric location representations are updated effortlessly and automatically. In synthetic environments, however, this effortless, continuous spatial updating is often disrupted or incomplete because sensory movement information, especially body-based information, is lacking. To prevent disorientation in virtual reality caused by missing body-based information, spatial updating must be supported by other sensory movement cues. In the present experiment, participants performed a spatial updating task in a sparse virtual scene presented inside a CAVE (Cave Automatic Virtual Environment). The task was to navigate back to a starting position after simulated movements with either no orientation cues, three visible distant landmarks, or one continuous auditory cue present. The focus was not to compare visual and auditory cues but to explore the viability of auditory cueing, with visual cues serving as a reference. Overall, the data showed improved task performance when an orientation cue was present, with auditory cues providing at least as much improvement as visual cues. Our results indicate that auditory cues in virtual environments can support spatial updating when body-based information is missing.
