Abstract

Human Operators (HO) of telerobotics systems can accomplish complex operations with robots, but designing usable and effective Human-Robot Interaction (HRI) remains very challenging for system developers and human factors specialists. The search for new metaphors and techniques for HRI adapted to telerobotics systems has led to the concept of Multimodal HRI (MHRI). MHRI enables natural and easy interaction with robots by combining several input devices with an efficient Multimodal Management System (MMS). Such a system should bring a new user experience in terms of natural interaction, usability, efficiency and flexibility to HRI systems, so good management of multimodality is essential. Moreover, the MMS must be transparent to the user in order to be efficient and natural. Empirical evaluation is necessary to assess the quality of our MMS. We use an Empirical Evaluation Assistant (EEA) designed in the IBISC laboratory, which makes it possible to rapidly gather significant feedback on the usability of the interaction during the development lifecycle, whereas HRI is classically evaluated by ergonomics experts only at the end of that lifecycle. Results from a preliminary evaluation of a robot teleoperation task are given, using the ARITI software framework to assist the user in piloting the robot and the IBISC semi-immersive VR/AR platform EVR@. The evaluation compares a Flystick and Data Gloves for 3D interaction with the robot. The results show that our MMS is functional, although the multimodality used in our experiments is not sufficient to provide efficient Human-Robot Interaction. The EVR@ SPIDAR force-feedback device will be integrated into our MMS to improve the user's efficiency.
