Abstract

This paper presents a novel, wearable navigation system for visually impaired and blind pedestrians that combines a global positioning system (GPS) for user outdoor localization and tactile-foot stimulation for information presentation. Real-time GPS data provided by a smartphone are processed by dedicated navigation software to determine the directions to a destination. Navigational directions are then encoded as vibrations and conveyed to the user via a tactile display that inserts into the shoe. The experimental results showed that users were capable of recognizing with high accuracy the tactile feedback provided to their feet. The preliminary tests conducted in outdoor locations involved two blind users who were guided along 380–420 m predetermined pathways, while sharing the space with other pedestrians and facing typical urban obstacles. The subjects successfully reached the target destinations. The results suggest that the proposed system enhances independent, safe navigation of blind pedestrians and show the potential of tactile-foot stimulation in assistive devices.
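The core pipeline the abstract describes — smartphone GPS fixes processed into directions, which are then encoded as coarse tactile cues — can be sketched in a few lines. This is an illustrative sketch only, not the authors' navigation software: the function names (`initial_bearing`, `turn_cue`) and the four-way quantization of the heading offset are assumptions made here for clarity; the standard great-circle initial-bearing formula is used.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial compass bearing (degrees, 0-360) from point 1 to point 2,
    using the standard atan2 formulation on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def turn_cue(bearing_deg, heading_deg):
    """Quantize the offset between the route bearing and the user's
    current heading into one of four coarse tactile cues."""
    offset = (bearing_deg - heading_deg + 360.0) % 360.0
    if offset < 45 or offset >= 315:
        return "forward"
    if offset < 135:
        return "right"
    if offset < 225:
        return "back"
    return "left"
```

In a real system each cue would drive a distinct vibration pattern on the in-shoe display; the 90° sectors used here are a simplification of whatever discretization the actual navigation software applies.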

Highlights

  • Navigation assistive technology for visually impaired and blind people has been an active subject of study for decades. Two different processes of human mobility have been identified for navigation assistive system design: sensing of the immediate environment and orientation during travel [1].

  • The high recognition rates are due to an optimized tactile rendering approach that tackles two perception issues identified in previous experiments: (1) missed vibrations: the first three vibrations prevent users from missing a vibration and failing at direction identification; and (2) inaccurate discrimination: the fourth vibration points in the opposite direction.

  • Assistive technology can benefit from these ubiquitous computing resources to improve the quality of life of people in need


Introduction

Two different processes of human mobility have been identified for navigation assistive system design: sensing of the immediate environment and orientation during travel [1]. While the former refers to the gathering of spatial information for obstacle (or any other travel obstruction) detection, the latter involves updating the traveler’s location along a route and providing continuous guidance to reach a destination. Examples of obstacle detection systems can be traced back to the 1970s, when sonar technology was at its peak [2,3]. A short while later, sonar systems evolved into ultrasonic sensors, improving the accuracy of distance-to-obstacle measurements [4,5].
