Abstract

Human guidance in situations where the users cannot rely on their main sensory modalities, such as assistive or search-and-rescue scenarios, is a challenging task. In this letter, we address the problem of guiding users along collision-free paths in dynamic environments, assuming that they cannot rely on their main sensory modalities. In order to guide the subjects safely, we adapt optimal reciprocal collision avoidance (ORCA) to our specific problem. The proposed algorithm takes into account the stimuli that can be displayed to the users and the uncertainty in the users' motion when reacting to the provided stimuli. The algorithm was evaluated in three different dynamic scenarios. A total of 18 blindfolded human subjects were asked to follow haptic cues in order to reach a target area while avoiding real static obstacles and other moving users. Three metrics, namely time to reach the goal, trajectory length, and minimum distance from the obstacles, are used to compare the results obtained with this approach against experiments performed without visual impairment. Experimental results reveal that the blindfolded subjects were able to avoid collisions and safely reach the targets in all the performed trials. Although in this letter we display directional cues via haptic stimuli, we believe that the proposed approach is general and can be tuned to work with different haptic interfaces and/or feedback modalities.
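For readers unfamiliar with ORCA, the sketch below illustrates the core geometric step in its standard, non-adapted form: for each pair of agents, the set of relative velocities leading to a collision within a time horizon tau is approximated, each agent receives a half-plane of permitted velocities, and it then selects a permitted velocity close to its preferred one. This is only an illustrative sketch, not the authors' implementation; the agent radii, time horizon, example positions, and the simple projection-based velocity selection (used here in place of ORCA's exact linear program) are placeholder assumptions, and the paper's adaptation to displayable haptic stimuli and user motion uncertainty is not reproduced.

```python
import numpy as np

def orca_halfplane(p_a, v_a, r_a, p_b, v_b, r_b, tau, dt):
    """Return (point, direction) of the ORCA line for agent A w.r.t. agent B.
    Velocities v with det(direction, v - point) >= 0 (i.e. on the left of the
    directed line) are treated as collision-free over the horizon tau."""
    rel_pos = p_b - p_a
    rel_vel = v_a - v_b
    dist_sq = rel_pos @ rel_pos
    r = r_a + r_b
    if dist_sq > r * r:                        # agents not currently overlapping
        w = rel_vel - rel_pos / tau            # offset from the cutoff-disc centre
        w_dot_pos = w @ rel_pos
        if w_dot_pos < 0 and w_dot_pos ** 2 > r * r * (w @ w):
            # closest exit from the truncated velocity obstacle: the cutoff disc
            w_len = np.sqrt(w @ w)
            unit_w = w / w_len
            direction = np.array([unit_w[1], -unit_w[0]])
            u = (r / tau - w_len) * unit_w
        else:
            # closest exit: one of the two cone legs
            leg = np.sqrt(dist_sq - r * r)
            if rel_pos[0] * w[1] - rel_pos[1] * w[0] > 0:      # left leg
                direction = np.array([rel_pos[0] * leg - rel_pos[1] * r,
                                      rel_pos[0] * r + rel_pos[1] * leg]) / dist_sq
            else:                                              # right leg
                direction = -np.array([rel_pos[0] * leg + rel_pos[1] * r,
                                       -rel_pos[0] * r + rel_pos[1] * leg]) / dist_sq
            u = (rel_vel @ direction) * direction - rel_vel
    else:                                      # already overlapping: resolve within dt
        w = rel_vel - rel_pos / dt
        w_len = np.sqrt(w @ w)
        unit_w = w / w_len
        direction = np.array([unit_w[1], -unit_w[0]])
        u = (r / dt - w_len) * unit_w
    point = v_a + 0.5 * u                      # reciprocity: A takes half the correction
    return point, direction

def pick_velocity(v_pref, lines, iters=25):
    """Heuristic stand-in for ORCA's exact 2-D linear program: repeatedly
    project the candidate velocity onto any violated half-plane."""
    v = np.array(v_pref, dtype=float)
    for _ in range(iters):
        for point, direction in lines:
            d = v - point
            if direction[0] * d[1] - direction[1] * d[0] < 0:  # constraint violated
                v = point + (d @ direction) * direction        # project onto the line
    return v

# Hypothetical example: guided user A walking towards a goal, one other user B.
p_a, v_a = np.array([0.0, 0.0]), np.array([1.0, 0.0])
p_b, v_b = np.array([3.0, 0.2]), np.array([-1.0, 0.0])
line = orca_halfplane(p_a, v_a, 0.4, p_b, v_b, 0.4, tau=2.0, dt=0.1)
v_new = pick_velocity(np.array([1.0, 0.0]), [line])  # safe velocity to turn into a cue
```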
