Abstract

Navigation in virtual environments relies on accurate spatial rendering. A virtual object is localized by its position in the environment, usually defined by three coordinates: azimuth, elevation, and distance. Although several studies have investigated the perception of auditory and visual cues in azimuth and elevation, little work has addressed the distance dimension. This study investigates how humans estimate the visual and auditory egocentric distances of virtual objects. Subjects were asked to estimate the egocentric distance of objects located 2–20 m away in three contexts: auditory perception alone, visual perception alone, and a combination of both (with coherent and incoherent visual and auditory cues). Although egocentric distance was underestimated in all contexts, the results showed that visual information influenced perceived distance more strongly than auditory information. Specifically, the bimodal incoherent condition yielded perceived distances equivalent to those in the visual-only condition only when the visual target was closer to the subject than the auditory target.

