Abstract

The design of User Interfaces (UIs) is a vital part of Human-Machine Interaction (HMI) and strongly affects performance during collaboration or teleoperation. Ideally, UIs should be intuitive and easy to learn, but designing them is challenging, especially for complex tasks involving robots with many degrees of freedom. In this paper, we pose the UI design problem as finding the mapping from an interface device with M input degrees of freedom to the commands driving a robot with N output degrees of freedom. We describe a novel adaptive scheme that can learn this M-to-N input-output map such that task-related performance measures are maximized. The resulting “Genetic Adaptive User Interface” (GAUI) is formulated and used to minimize a cost function related to the user’s teleoperation performance. The algorithm is an unsupervised learning scheme that requires no knowledge of the robot, the user, or the environment. To validate our approach, we provide simulation and experimental results with a non-holonomic robot and two control interfaces: a joystick and a Myo gesture control armband. The results demonstrate that the adaptively trained map closely mimics the intuitive commands of the joystick interface and also yields an easily controllable mapping for the unintuitive gesture control armband. The abstract formulation of the method allows easy modification of the performance measure and application to other HMI tasks.
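
To make the idea concrete, below is a minimal sketch of the kind of search the abstract describes: a genetic algorithm evolving an M-to-N input-output map so that a task-related teleoperation cost is minimized. The linear map, the scripted stand-in user, the toy goal-reaching cost, and all GA settings (population size, selection, crossover, mutation) are illustrative assumptions, not the paper's actual GAUI formulation.

```python
# Sketch: a genetic algorithm evolving an M-to-N linear input-output map
# to minimize a task-related cost, evaluated by black-box rollouts only.
# All specifics here are illustrative assumptions, not the paper's GAUI.
import numpy as np

M, N = 2, 2             # interface input DoF and robot output DoF (assumed)
POP, GENS = 30, 100     # GA population size and generations (assumed)
rng = np.random.default_rng(0)

def task_cost(W, starts, steps=20, dt=0.1):
    """Average distance to the goal (origin) after a short episode.

    A scripted 'user' pushes toward the goal along raw input axes; the
    candidate map W turns those inputs into robot velocity commands.
    """
    total = 0.0
    for p0 in starts:
        p = p0.copy()
        for _ in range(steps):
            u = -p                 # naive stand-in for human input
            p = p + dt * (u @ W)   # robot moves under the candidate map
        total += float(np.linalg.norm(p))
    return total / len(starts)

def evolve():
    starts = rng.normal(size=(10, M))       # random task episodes
    pop = rng.normal(size=(POP, M, N))      # random initial maps
    for _ in range(GENS):
        fitness = np.array([task_cost(W, starts) for W in pop])
        elite = pop[np.argsort(fitness)][: POP // 2]   # keep best half
        # crossover: average random pairs of elite parents, then mutate
        pairs = elite[rng.integers(len(elite), size=(POP - len(elite), 2))]
        children = pairs.mean(axis=1)
        children += rng.normal(scale=0.1, size=children.shape)
        pop = np.concatenate([elite, children])
    return min(pop, key=lambda W: task_cost(W, starts))

W_best = evolve()
print("learned input-output map:\n", W_best)
```

Because the cost is evaluated purely by rolling out episodes, the search consults no model of the robot, user, or environment, mirroring the unsupervised character of the scheme described above.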
