Abstract

We present the design and early-stage implementation of Uasisi, a system that assists visually impaired people in navigating physical spaces. In its simplest form, a single wearable device assists the user by sensing the proximity of objects in the environment and providing vibratory feedback. More devices can be added modularly to form a wireless body area network, which can in turn be connected to a cloud infrastructure where data on the user's navigational patterns can be analyzed to support further predictions. Our system can also become part of an augmented environment in which smart objects communicate and cooperate with each other to provide a richer, more complex navigation experience for the user. We present a series of experiments in which various configurations of the system are tested with users who cannot see and must navigate spaces previously unknown to them.
