Abstract

While the visual system provides the dominant sensory input for awareness, orientation, and navigation in sighted humans, auditory perception becomes important in low-light conditions, in smoke, and for visually impaired and blind individuals. Acoustic information about the environment includes sounds radiated from sources, as well as reflections and reverberation arising in enclosed spaces in response to either external sound sources or self-produced sounds (e.g., mouth clicks), referred to as echolocation. Here, we investigated orientation and navigation in typical corridors based on acoustic cues only, using untrained sighted humans and real-time virtual acoustics. Virtual sound sources and echolocation with both predefined sounds and the participants' own vocalizations were tested, with the goal of identifying suitable techniques for a mobility aid based on acoustically augmented reality (AAR). A hand-held acoustic pointer, rendering a virtual sound source at the closest wall in the pointing direction, was best suited for orientation and navigation. For identification of the room shape, the acoustic pointer achieved, in specific conditions, performance similar to that obtained with a reference (visual) laser pointer. In addition, an AAR-based echolocation pointer with highly directed sound radiation, as achievable in the ultrasonic range, was suggested.
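The abstract does not give implementation details for the acoustic pointer. The following is a minimal sketch of the core geometric step, assuming an axis-aligned rectangular corridor and a simple ray-cast from the hand-held pointer; the function name nearest_wall_hit, the room dimensions, and all numeric values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nearest_wall_hit(origin, direction, room_min, room_max):
    """Cast a ray from the pointer against an axis-aligned room box and
    return the closest wall intersection point, or None if there is no hit.
    (Hypothetical helper; the paper does not specify this algorithm.)"""
    direction = direction / np.linalg.norm(direction)
    best_t, best_hit = np.inf, None
    for axis in range(3):
        if abs(direction[axis]) < 1e-9:
            continue  # ray is parallel to this pair of walls
        for bound in (room_min[axis], room_max[axis]):
            t = (bound - origin[axis]) / direction[axis]
            if t <= 0:
                continue  # this wall lies behind the pointer
            hit = origin + t * direction
            # the hit must lie within the room extent on the other two axes
            others = [a for a in range(3) if a != axis]
            inside = all(room_min[a] - 1e-9 <= hit[a] <= room_max[a] + 1e-9
                         for a in others)
            if inside and t < best_t:
                best_t, best_hit = t, hit
    return best_hit

# Example: a 20 m x 2 m x 3 m corridor, pointer held at head height
room_min = np.array([0.0, 0.0, 0.0])
room_max = np.array([20.0, 2.0, 3.0])
origin = np.array([1.0, 1.0, 1.6])
direction = np.array([1.0, 0.3, 0.0])  # down the corridor, slightly rightward

source_pos = nearest_wall_hit(origin, direction, room_min, room_max)
print("virtual source placed at:", source_pos)
# A real-time AAR renderer would then spatialize a sound at source_pos
# (e.g., via binaural/HRTF rendering), so the listener hears the wall's
# location and distance in the pointing direction.
```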
