Abstract

In this paper, the Virtually Enhanced Senses (VES) System is described. It is an ARCore-based, mixed-reality system designed to assist the navigation of blind and visually impaired people. VES operates in indoor and outdoor environments without any previous in-situ installation. It provides users with specific, runtime-configurable stimuli according to their pose, i.e., position and orientation, and the information of the environment recorded in a virtual replica. It implements three output data modalities: wall-tracking assistance, an acoustic compass, and a novel sensory substitution algorithm, Geometry-based Virtual Acoustic Space (GbVAS). The multimodal output of this algorithm takes advantage of natural human perceptual encoding of spatial data. Preliminary experiments with GbVAS have been conducted with sixteen subjects in three different scenarios, demonstrating basic orientation and mobility skills after six minutes of training.
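To make the pose-driven feedback concrete, the sketch below illustrates how two of the described modalities (wall-tracking assistance and the acoustic compass) could map a user's pose against the virtual replica to a stimulus. All function names, the 0.6 m target distance, and the pan mapping are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of VES-style pose-driven feedback.
# Names, thresholds, and mappings are illustrative assumptions only.

def wall_tracking_cue(distance_to_wall_m: float, target_m: float = 0.6) -> float:
    """Signed correction for wall-tracking assistance:
    negative = too close to the wall, positive = too far."""
    return distance_to_wall_m - target_m

def acoustic_compass_pan(heading_deg: float, target_bearing_deg: float) -> float:
    """Map the angular error between the user's heading and the target
    bearing to a stereo pan value in [-1, 1] (negative = left)."""
    # Wrap the error into (-180, 180] so the cue always takes the short way round.
    error = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    return max(-1.0, min(1.0, error / 90.0))
```

In an actual system, values like these would drive the runtime-configurable audio output; the point of the sketch is only that each cue is a pure function of the current pose and the virtual replica's geometry.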

Highlights

  • Recent scientific and technological advances have opened new possibilities in the development of navigation systems, i.e., systems that provide required or helpful data to get to a destination point, adapted for blind and visually impaired (BVI) users

  • The Virtually Enhanced Senses (VES) prototype described throughout this document has been assessed as a versatile platform for testing non-visual human–machine interfaces for BVI navigation in indoor and outdoor environments, both real and virtual

  • It is a software application running on commodity devices, which eases its use for experimentation purposes and might encourage further development in the field

Introduction

Recent scientific and technological advances have opened new possibilities in the development of navigation systems, i.e., systems that provide required or helpful data to reach a destination point, adapted for blind and visually impaired (BVI) users. These interfaces must provide adequate data output to the user's remaining sensory capabilities; in the event of severe or total sensory loss, it is necessary to make use of alternative sensory modalities. As described in a previous review, this requirement has limited the potential of several proposals [3]. In this context, the present work focuses on developing more effective and efficient non-visual human–machine interfaces for BVI navigation systems. In contrast to previous projects, the visual and acoustic output, as well as the virtual scenario, can be configured from a server at runtime. Scenarios of various sizes and complexity can be used, from simple mazes to urban environments with moving vehicles.

Related Work
The Virtually Enhanced Senses System
User Requirements
Experimental
Results
Discussion
Conclusions