Abstract
This paper presents a novel method for the visualization of 3D spatial sounds in Virtual Reality (VR) for Deaf and Hard-of-Hearing (DHH) people. Our method enhances traditional VR devices with additional haptic and visual feedback, which aids spatial sound localization. The proposed system automatically analyses the 3D sound of a VR application and indicates the direction of sound sources to the user through two vibro-motors and two light-emitting diodes (LEDs). The benefit of automatic sound analysis is that our method can be used in any VR application without modifying the application itself. We evaluated the proposed method for 3D spatial sound visualization in a user study. Additionally, the study investigated which condition (corresponding to different senses) leads to faster performance in a 3D sound localization task. For this purpose, we compared three conditions: haptic feedback only, LED feedback only, and combined haptic and LED feedback. Our results suggest that DHH participants completed sound-related VR tasks significantly faster under the LED-only and combined haptic+LED conditions than with haptic feedback alone. The presented method for spatial sound visualization can be directly used to enhance VR applications for use by DHH persons, and the results of our user study can serve as guidelines for the future design of accessible VR systems.
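To illustrate the kind of mapping such a system performs, the following minimal Python sketch converts the horizontal direction (azimuth) of a sound source, relative to the listener's head, into left/right intensities for two vibro-motors and two LEDs. This is not the authors' implementation; all function names, the coordinate convention, and the intensity mapping are assumptions made for illustration only.

```python
# Illustrative sketch (assumed, not the paper's implementation): map the
# azimuth of a 3D sound source relative to the listener's head to left/right
# cue intensities for two vibro-motors and two LEDs.
import math


def azimuth_deg(listener_pos, listener_yaw_deg, source_pos):
    """Horizontal angle of the source relative to the listener's facing
    direction, in degrees; negative = to the left, positive = to the right."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    world_angle = math.degrees(math.atan2(dx, dz))  # 0 deg = facing +z
    # Wrap the relative angle into [-180, 180].
    return (world_angle - listener_yaw_deg + 180.0) % 360.0 - 180.0


def direction_cues(azimuth):
    """Split an azimuth in [-180, 180] degrees into left/right intensities in [0, 1]."""
    # A source straight ahead produces weak cues on both sides; a source to
    # one side drives that side's motor and LED harder.
    right = max(0.0, math.sin(math.radians(azimuth)))
    left = max(0.0, -math.sin(math.radians(azimuth)))
    return left, right


if __name__ == "__main__":
    # Listener at the origin facing +z; a sound source ahead and to the right.
    left, right = direction_cues(azimuth_deg((0, 0, 0), 0.0, (1.0, 0.0, 1.0)))
    print(f"left motor/LED: {left:.2f}, right motor/LED: {right:.2f}")
```

In an actual system, the resulting intensities would be written to the motor and LED drivers on each frame, and the azimuth would be recomputed as the user's head orientation changes.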