Abstract

Visual information processing technology is central to sensory substitution for visually impaired persons, as well as to applications such as factory automation. This paper outlines the design of a visual support system that conveys 3D visual information through 3D virtual sounds. Three-dimensional information required by the visually impaired user, such as a distance map, object recognition, and object tracking, is obtained by analyzing images captured by stereo cameras. A 3D virtual acoustic display based on Head-Related Transfer Functions (HRTFs) then informs the user of the locations and movements of objects. Because the system uses bone conduction headphones, which do not block out environmental sounds, the user's external auditory sense is not impeded. The proposed system is expected to be useful in situations where infrastructure is incomplete and conditions change in real time. We plan experiments in which the system guides users while walking and playing sports.

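The abstract describes two core processing steps: estimating object distance from stereo images and presenting object locations through HRTF-based virtual sound. The sketch below illustrates how these steps might be prototyped; the camera parameters, HRIR arrays, and function names are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (assumed parameters, not the authors' implementation):
# (1) depth from a rectified stereo pair, (2) binaural rendering via HRIR convolution.
import numpy as np
import cv2
from scipy.signal import fftconvolve

def depth_map(left_gray, right_gray, focal_px=700.0, baseline_m=0.12):
    """Estimate a per-pixel depth map in metres from a rectified stereo pair.

    focal_px and baseline_m are placeholder calibration values.
    """
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16.
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # mask invalid matches
    return focal_px * baseline_m / disparity    # Z = f * B / d

def spatialize(mono_cue, hrir_left, hrir_right):
    """Render a mono audio cue as a binaural signal for a given direction.

    hrir_left / hrir_right are head-related impulse responses for the target
    direction, assumed to come from a measured HRTF dataset.
    """
    left = fftconvolve(mono_cue, hrir_left)
    right = fftconvolve(mono_cue, hrir_right)
    return np.stack([left, right], axis=1)      # (samples, 2) stereo output
```

In such a pipeline, the estimated object position would select the HRIR pair for the corresponding direction, and the rendered binaural signal would be played through bone conduction headphones so that environmental sounds remain audible.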