Abstract

Recent investigation in haptic human-robot interaction suggests that there are ultimately only two basic modalities for generating tactile feedback in haptic human interfaces. These allow the human operator to handle either (i) temporary virtual-reality-based material replicas of the local geometric and/or force profile at the contact areas of an unlimited set of generic objects that could virtually be handled during the manipulation, or (ii) permanent material replicas of a limited set of typical objects. In this paper, the two modalities are analyzed, and tactile human interfaces developed by the authors for telerobotic blind tactile exploration of objects and for telerobotic hapto-visual stylus-style tool manipulation are presented to illustrate the proposed approaches. The necessary modelling of the elastic properties of 3D objects from experimental tactile and range imaging data is also presented, using a neural network architecture that becomes an important component of the haptic interface.

Highlights

  • While discussing human perception mechanisms, Sekuler and Blake [1] eloquently stated that "... whether exploring gross or small details, the hand and the finger pads convey the most useful tactile information about objects"

  • Human haptic perception is the result of a complex investigatory dexterous manipulation act involving two distinct sensing components: (i) tactile, or cutaneous, information from touch sensors which provide data about contact force, local geometric profile, texture, and temperature of the touched object-area, and (ii) kinesthetic information about the positions and velocities of the kinematic structure of the hand [2]

  • Telerobotic dexterous manipulation in changing and unstructured environments combines the low-level robot computer control with the higher-level perception and task planning abilities of a human operator equipped with adequate human interfaces [6]


Summary

INTRODUCTION

While discussing human perception mechanisms, Sekuler and Blake [1] eloquently stated that "... whether exploring gross or small details, the hand and the finger pads convey the most useful tactile information about objects." Human haptic perception is the result of a complex investigatory dexterous manipulation act involving two distinct sensing components: (i) tactile, or cutaneous, information from touch sensors, which provides data about the contact force, local geometric profile, texture, and temperature of the touched object area, and (ii) kinesthetic information about the positions and velocities of the kinematic structure (bones and muscles) of the hand [2]. This paper discusses the basic generation principles for the local geometric and force profile components of the tactile feedback provided by haptic human interfaces. This approach allows for the design of specialized haptic human interfaces that are optimized for typical haptic manipulation tasks. The paper concludes with the description of a neural network hapto-visual modeling technique that allows the capture, storage, and rendering in real time of the complex elastic properties of 3D objects from experimental tactile and range imaging data.
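The paper does not specify the neural architecture here, but the idea of learning a renderable elastic model from tactile measurements can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' method): a one-hidden-layer network, trained by plain gradient descent, learns to map a surface contact location and indentation depth to a reaction force on synthetic data shaped like a Hertz-type elastic response with spatially varying stiffness.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_samples(n):
    """Synthetic stand-in for experimental tactile/range data (assumption:
    Hertz-like response f = k(x, y) * d**1.5 with position-dependent stiffness)."""
    xy = rng.uniform(-1.0, 1.0, size=(n, 2))   # contact location on the surface
    d = rng.uniform(0.0, 1.0, size=(n, 1))     # indentation depth
    k = 1.0 + 0.5 * np.cos(np.pi * xy[:, :1])  # spatially varying stiffness
    return np.hstack([xy, d]), k * d**1.5      # inputs, contact forces

# Small MLP: (x, y, depth) -> force.  Weights are trained offline; at run time
# a single forward pass per haptic frame renders the learned elastic response.
W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

X, y = synthetic_samples(2048)
losses, lr = [], 0.05
for step in range(2000):
    h = np.tanh(X @ W1 + b1)                   # hidden activations
    err = (h @ W2 + b2) - y                    # prediction error
    losses.append(float(np.mean(err**2)))
    # backpropagation through the two layers (full-batch gradient descent)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Once trained, the network is a compact, differentiable surrogate for the measured elastic profile, which is what makes real-time rendering inside a haptic control loop feasible.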

TACTILE FEEDBACK GENERATION IN HAPTIC HUMAN INTERFACES
