Abstract

Applications for dexterous robot teleoperation and immersive virtual reality are growing. Haptic user input devices need to allow the user to intuitively command and seamlessly “feel” the environment they work in, whether virtual or a remote site accessed through an avatar. We introduce the DLR Exodex Adam, a reconfigurable, dexterous, whole-hand haptic input device. The device comprises multiple modular, three-degree-of-freedom (3-DOF) robotic fingers, whose placement on the device can be adjusted to optimize manipulability for different user hand sizes. Additionally, the device is mounted on a 7-DOF robot arm to increase the user’s workspace. Exodex Adam uses a front-facing interface, with robotic fingers coupled to two of the user’s fingertips, the thumb, and two points on the palm. Including the palm, as opposed to only the fingertips as is common in existing devices, enables accurate tracking of the whole hand without additional sensors such as a data glove or motion capture. By providing whole-hand interaction with omnidirectional force feedback at the attachment points, we enable the user to experience the environment with the complete hand instead of only the fingertips, thus realizing deeper immersion. Interaction using Exodex Adam can range from palpation of objects and surfaces to manipulation using both power and precision grasps, all while receiving haptic feedback. This article details the concept and design of the Exodex Adam, as well as use cases where it is deployed with different command modalities. These include mixed-media interaction in a virtual environment, gesture-based telemanipulation, and robotic hand–arm teleoperation using adaptive model-mediated teleoperation. Finally, we share the insights gained during our development process and use case deployments.

Highlights

  • With our hands, we can communicate, explore the world around us, manipulate it, and mold it

  • To ascertain a desirable attachment point for the user’s index and middle fingers on the robotic fingers, we examined the available workspace for three planar locations: plain distal, distal-palmar, and distal-dorsal

  • To examine the workspace that these pose adjustments can provide for the user, we explored different attachment configurations for the user’s thumb, evaluating the achievable workspace with the thumb attached to the robotic finger in different base configurations


Introduction

With our hands, we can communicate (read Braille, make gestures, or speak sign language), explore the world around us (feel surface impedances, textures, weights, temperature, and pressure), manipulate it, and mold it. The human somatosensory system is essential to these abilities. It encompasses knowledge of the position and orientation of our body in space (proprioception) and the sense of motion in our joints (kinesthesia), as well as the perception of sensory signals from the mechanoreceptors in our skin (cutaneous perception) (Hannaford and Okamura, 2016). All these senses contribute to our ability to receive haptic feedback when interacting with the environment. Haptic interaction is still less widespread than visual or auditory interaction, but it is gaining interest as haptic technology develops.

