Abstract

Immersive 3D virtual environments, such as simulations and serious games for education and training, are typically multimodal, incorporating at the very least both visual and auditory cues, each of which may require considerable computational resources, particularly when high-fidelity environments are sought. It is widely accepted that sound can influence the other modalities. Our own previous work has shown that sound cues (both contextual and non-contextual with respect to the visual scene) can either increase or decrease (depending on the sound) visual fidelity (quality) perception, as well as the time required to complete a simple task (task completion time) within a virtual environment. However, despite the importance and benefits of spatial sound (sound that goes far beyond traditional stereo and surround-sound techniques, allowing users to perceive a sound source at an arbitrary position in three-dimensional space), our previous work did not consider spatial sound cues. Here we build upon that work by describing the results of an experiment conducted to examine visual fidelity (quality) perception and task performance in the presence of various spatial sound cues, including acoustical reverberation and occlusion/diffraction effects, while participants completed a simple task within a virtual environment.
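To make the notion of spatial sound concrete, the sketch below shows, under simplifying assumptions, how a source's 3D position can be mapped to per-ear gains using a crude interaural-level-difference pan plus inverse-distance attenuation. This is purely illustrative and is not the rendering model used in the study; real spatial audio systems additionally employ head-related transfer functions (HRTFs), reverberation, and occlusion/diffraction modelling, as the abstract notes. The function name and coordinate convention (z forward, x right) are assumptions for the example.

```python
import math

def spatialize_gain(source_xyz, listener_xyz, listener_yaw=0.0):
    """Illustrative per-ear gains for one sound source.

    Uses a sine-law interaural-level-difference (ILD) pan and 1/r
    distance attenuation. Coordinate convention (assumed here):
    +z is the listener's forward direction, +x is to the right.
    """
    dx = source_xyz[0] - listener_xyz[0]
    dy = source_xyz[1] - listener_xyz[1]
    dz = source_xyz[2] - listener_xyz[2]
    dist = max(math.sqrt(dx * dx + dy * dy + dz * dz), 1e-6)
    # Azimuth of the source relative to the listener's facing direction.
    azimuth = math.atan2(dx, dz) - listener_yaw
    # Sine-law pan: a source to the right boosts the right-ear gain.
    pan = math.sin(azimuth)            # -1 (hard left) .. +1 (hard right)
    left = math.sqrt((1.0 - pan) / 2.0)
    right = math.sqrt((1.0 + pan) / 2.0)
    atten = 1.0 / dist                 # simple inverse-distance falloff
    return left * atten, right * atten
```

For example, a source directly ahead of the listener yields equal gains in both ears, while a source off to the right yields a louder right ear; level differences of this kind are one of the cues listeners use to localize sound, alongside timing differences and spectral (HRTF) cues that this sketch omits.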
