A considerable body of empirical work documents the role of visual perception in spatial cognition and the mental representation of space, whereas only a few studies have addressed the role of other senses, in particular audition. The present investigation focuses on the role of audition in wayfinding and in the construction of a mental representation of a spatial environment. Our experiments are conducted in a virtual environment (VE) in which participants perform a navigation task: locating non-visual sound sources that occupy specific positions in the surrounding space. The VE is purely auditory, with no visual feedback. The task is presented as a game in which the sources are virtual bombs emitting a countdown sound; participants must find and deactivate each bomb before it explodes, deactivation stopping the emission of the corresponding sound source. The data of interest are the participants' trajectories, their response times, and their in-game performance. Post-experimental reconstruction tasks are also conducted in an attempt to externalize the participants' mental representation of the space. In this experimental context, the value of virtual reality lies in the control it offers over the modalities available to the participants, allowing straightforward comparison of natural and distorted perceptual information. Here, to focus on the importance of sound information and to rule out the possible use of sensorimotor contingency cues, participants do not move physically through the scene; instead, they use a joystick to control translational movement in the VE. The sound scene is rendered over headphones using binaural 3D-audio techniques and is continuously updated according to the participant's position and orientation in the VE. We contrast two conditions for determining the participant's orientation: it is either taken directly from the direction of motion in the VE or measured with a head-tracker. The latter option allows participants to change the orientation of their listening perspective without altering their joystick-controlled direction of motion. Previous studies have shown the importance for auditory localization of dynamic acoustic cues that the brain correlates with vestibular information (Minnaar 2001; Wightman and Kistler 1999), but such studies have always been conducted in set-ups where the participants' position and orientation remain fixed in the sound scene. The distinctive feature of our approach is to investigate the role of head movements in a navigational context. The first data are collected from sighted participants in order to establish a baseline against which blind participants will later be compared.
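As a minimal sketch of the per-frame update logic described above (all function and variable names are hypothetical; the paper does not specify its rendering implementation), the direction of each source relative to the listener, as would be fed to a binaural renderer, could be computed as follows, with the head-tracked condition adding a tracker yaw offset on top of the joystick-controlled heading:

```python
import math

def listening_yaw(motion_yaw, tracker_yaw, head_tracked):
    """Listening orientation (radians) under the two conditions.
    Condition 1: orientation follows the joystick-controlled
    direction of motion. Condition 2: a head-tracker yaw offset
    is added, decoupling listening direction from locomotion."""
    return motion_yaw + (tracker_yaw if head_tracked else 0.0)

def source_azimuth(listener_pos, yaw, source_pos):
    """Azimuth of a source relative to the listener's facing
    direction, wrapped to [-pi, pi) (positive = to the left)."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    world_angle = math.atan2(dy, dx)  # direction to source in world frame
    return (world_angle - yaw + math.pi) % (2 * math.pi) - math.pi

# Example: listener at the origin moving along +x, head turned 90 deg left.
yaw = listening_yaw(motion_yaw=0.0, tracker_yaw=math.pi / 2, head_tracked=True)
print(source_azimuth((0.0, 0.0), yaw, (0.0, 1.0)))  # ~0.0: source now dead ahead
```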