Body-grounded kinesthetic haptic devices can provide movement cues in multiple degrees of freedom by exerting forces directly on users, as in dexterous robot teleoperation tasks. However, these devices have limited workspaces, can destabilize a teleoperation control loop, and can be expensive. Portable haptic devices can approximate the sensations of a kinesthetic device without these shortcomings by exploiting various principles of the human sense of touch. Our goal is to analyze the feasibility of hand guidance (HG) using tangential force stimuli. Here we reveal and quantify how users interpret simultaneous tactile stimulation (STS) applied to multiple finger pads of the same hand. We conducted an extensive experiment with multiple users to determine the maximum number of comprehensible cues that can serve as movement commands for HG. As expected, many of the tested tactile stimuli were meaningless to users, but a few could be clearly interpreted; we call these “intuitive movement cues”. For the experiment, we designed a device that can be held in the palm and applies tactile stimuli to the finger pads of the thumb and index finger, or the thumb and middle finger. We performed two studies in which we identified how salient different movement cues are, in particular commands to redirect hand position and orientation along four axes: moving forward/backward, wrist twisting right/left (rotating clockwise/counter-clockwise), moving right/left, and wrist tilting up/down (rotating upward/downward). The results revealed that this approach provides 7 intuitive directional movement cues for relative HG in 3D space. The proposed HG principle is promising for applications such as robotic surgery training, laparoscopic training, and needle insertion training, in which surgical trainees must learn dexterous hand movements along motion paths. Many applications for 3D movement guidance outside the medical domain could also benefit from this haptic technology, including training for precise manipulation and assembly tasks, augmented teleoperation, and communication during shared control in collaborative human-machine systems.