Auditory cues are integrated with visual and body-based self-motion cues for motion perception, balance, and gait, yet little research has evaluated their effectiveness for navigation. Here, we tested whether an auditory cue co-localized with a visual target could improve spatial updating in a virtual reality homing task. Participants completed a triangle-completion homing task with and without an easily localizable spatial audio signal co-located with the home location. The main outcome was unsigned angular error, defined as the absolute value of the difference between the participant's turning response and the correct response toward the home location. Angular error was significantly reduced in the presence of spatial sound compared to an identical head-fixed auditory signal: mean angular error was 22.79° with the spatial audio cue and 30.09° without it. Participants who performed worst without spatial sound showed the greatest improvement when the cue was added. These results suggest that auditory cues may benefit navigation, particularly for individuals with the greatest spatial-updating error in the absence of spatial sound.
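For concreteness, the outcome measure described above can be written as follows (the symbols are our notation, not the authors'): letting $\theta_{\text{response}}$ denote the participant's turning response and $\theta_{\text{home}}$ the correct turn toward the home location, the unsigned angular error is
\[
\varepsilon = \left| \theta_{\text{response}} - \theta_{\text{home}} \right|.
\]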