Abstract

Visual information processing technology is important both for sensory substitution for visually impaired persons and for applications such as factory automation. This paper outlines the design of a visual support system that conveys 3D visual information through 3D virtual sounds. Three-dimensional information required by a visually impaired user, such as a distance map, object recognition results, and object tracking, is obtained by analyzing images captured by stereo cameras. Using a 3D virtual acoustic display based on Head-Related Transfer Functions (HRTFs), the system informs the user of the locations and movements of objects. Because the system uses bone-conduction headphones, which do not block environmental sounds, the user's natural auditory sense is not impeded. The proposed system is expected to be useful in situations where supporting infrastructure is incomplete and conditions change in real time. We plan experiments in which the system guides users while walking and playing sports.
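
The abstract does not give implementation details, but the pipeline it describes (stereo images to a distance map, then spatialized audio cues rendered with HRTFs) can be illustrated with a minimal sketch. The code below assumes OpenCV, NumPy, and SciPy; the camera parameters, file names, and HRIR arrays are placeholders and are not taken from the paper.

```python
# Rough sketch of the two technical stages named in the abstract:
# (1) a depth (distance) map from a rectified stereo pair,
# (2) binaural rendering of a mono cue by convolution with an HRIR pair.
# All numeric parameters and file names below are illustrative assumptions.
import numpy as np
import cv2
from scipy.signal import fftconvolve


def depth_map(left_gray, right_gray, focal_px=700.0, baseline_m=0.12):
    """Estimate a depth map in metres from a rectified grayscale stereo pair."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # mark invalid matches
    return focal_px * baseline_m / disparity    # Z = f * B / d


def spatialize(mono, hrir_left, hrir_right):
    """Render a mono cue as a binaural (stereo) signal using an HRIR pair."""
    return np.stack([fftconvolve(mono, hrir_left),
                     fftconvolve(mono, hrir_right)], axis=-1)


if __name__ == "__main__":
    # Placeholder input images; a real system would grab frames from stereo cameras.
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    depth = depth_map(left, right)
    print("nearest obstacle at %.2f m" % np.nanmin(depth))
```

In an actual system of this kind, the HRIR pair would be selected (or interpolated) according to the estimated direction of each detected object, so that the rendered cue appears to come from the object's location; the sketch above only shows the convolution step.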
