Abstract

For most visually impaired people, simple tasks such as understanding the environment or moving safely through it represent huge challenges. The Sound of Vision system was designed as a sensory substitution device, based on computer vision techniques, that encodes any environment into a naturalistic representation delivered through audio and haptic feedback. The present paper reports a study on the usability of this system for visually impaired people in relevant environments. The aim of the study is to assess how well the system supports the perception and mobility of visually impaired participants in real-life environments and circumstances. The testing scenarios were devised to assess the added value of the Sound of Vision system compared to traditional assistive instruments, such as the white cane. Various data were collected during the tests to allow for a better evaluation of performance: system configuration, completion times, electro-dermal activity, video footage, and user feedback. With minimal training, the system could be successfully used in outdoor environments to perform various perception and mobility tasks. Both the participants' feedback and the evaluation results confirmed the benefit of the Sound of Vision device over the white cane: it provides early feedback about static and dynamic objects, as well as feedback about elevated objects, walls, negative obstacles (e.g., holes in the ground), and signs.

Highlights

  • The World Health Organization (WHO) estimates that at least 2.2 billion people around the world suffer from blindness or visual impairment

  • Besides their intrinsic value for the validation of the multi-sensory feedback employed in the Sound of Vision system, these results provide further valuable insights for the development of any sensory substitution device for the visually impaired:

  • Two users belonged to category 4, a category of blindness, meaning that visual acuity is less than 10% (finger counting at 1 m) and equal to or better than light perception

Introduction and Related Work

The World Health Organization (WHO) estimates that at least 2.2 billion people around the world suffer from blindness or visual impairment. Several systems have been proposed to help visually impaired people improve their perception and/or navigation in unknown environments. These devices incorporate different technologies and sensors. To obtain a more complex and accurate estimation of environment objects, a sensor fusion system comprising a low-power millimeter-wave (MMW) radar and an RGB-Depth (RGB-D) sensor is described in [2,3]. Using this data fusion, the authors ensured the accuracy and stability of the system under any illumination conditions and extended the object detection range up to 80 m.
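The complementary strengths of the two sensors in [2,3] can be illustrated with a minimal fusion sketch: the depth camera is reliable at short range, while the radar extends detection far beyond it. All thresholds and the weighting scheme below are illustrative assumptions for a single range estimate per detected object, not values taken from the cited work.

```python
# Hypothetical range-fusion sketch, loosely inspired by the MMW-radar +
# RGB-D approach summarized above. Thresholds and weights are assumptions.

RGBD_MAX_RANGE = 8.0    # assumed usable depth-camera range (m)
RADAR_MAX_RANGE = 80.0  # extended detection range reported for the radar (m)

def fuse_range(rgbd_range, radar_range):
    """Return a single fused range estimate (m) for one detected object.

    Either input may be None when the corresponding sensor has no reading.
    """
    if rgbd_range is not None and rgbd_range <= RGBD_MAX_RANGE:
        if radar_range is not None:
            # Both sensors see a nearby object: blend the two readings,
            # trusting the depth camera more at short range.
            return 0.7 * rgbd_range + 0.3 * radar_range
        return rgbd_range
    if radar_range is not None and radar_range <= RADAR_MAX_RANGE:
        # Beyond the camera's range (or when depth fails, e.g. in poor
        # lighting), fall back to the radar reading alone.
        return radar_range
    return None  # object not confirmed by either sensor
```

In this sketch the radar alone covers the 8–80 m band, which mirrors the cited claim that fusion both stabilizes near-field estimates and extends the detection range.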

