Robotic walking devices enable the intensive exercise needed to enhance gait rehabilitation therapies, and Mixed Reality (MR) techniques may improve engagement through immersive and interactive environments. This article introduces an MR-based multimodal human-robot interaction strategy designed to enable shared control with a Smart Walker. The MR system integrates virtual and physical sensors to (i) enhance safe navigation and (ii) facilitate intuitive mobility training in personalized virtual scenarios, using an interface with three elements: an arrow indicating the direction to follow, laser lines marking nearby obstacles, and an ellipse showing the activation zone. The multimodal interaction is context-based: the presence of nearby individuals and obstacles modulates the robot's behavior during navigation, simplifying collision avoidance while supporting socially appropriate navigation. An experiment was conducted to evaluate the proposed strategy and the self-explanatory nature of the interface. Volunteers were divided into four groups, each navigating under different conditions, and three evaluation methods were employed: task performance, self-assessment, and observational measurement. Analysis revealed that participants enjoyed the MR system and that volunteers who received no introductory explanation were mostly able to infer the purpose of the interface elements; those who interacted with the interface in the first session provided more correct answers. In future research, virtual elements will be integrated with the physical environment to enhance user safety during navigation, and the control strategy will be improved to consider both physical and virtual obstacles.