Abstract

Humanoid robots must be capable of interacting with the world using their hands. A variety of humanlike robot hands have been constructed, but it remains difficult to control these hands in a dexterous way. One challenge is grasp synthesis, where we wish to place the hand and control its shape to successfully grasp a given object. In this paper, we present a data-driven approach to grasp synthesis that treats grasping as a shape matching problem. We begin with a database of grasp examples. Given a model of a new object to be grasped (the query), shape features of the object are compared to shape features of hand poses in these examples in order to identify candidate grasps. For effective retrieval, we develop a novel shape matching algorithm that can accommodate the sparse shape information associated with hand pose and that considers the relative placements of contact points and normals, which are important for grasp function. We illustrate our approach with examples using a model of the human hand.
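To make the idea of matching "relative placements of contact points and normals" concrete, the following is a minimal sketch, not the paper's algorithm: it assumes an illustrative pairwise oriented-point descriptor (point-to-point distance plus normal angles), binned into a histogram and compared with a chi-squared distance; all function names and parameters are hypothetical.

```python
# Illustrative sketch only: the pairwise descriptor, histogram binning, and
# chi-squared comparison below are assumed choices, not the paper's exact method.
import numpy as np

def pair_features(points, normals):
    """Feature vector for every pair of oriented contact points.

    Each pair (i, j) yields:
      d      -- distance between the two points
      a1, a2 -- angles between each normal and the line joining the points
      a3     -- angle between the two normals
    These depend only on relative placement, so they are invariant to
    rigid transformations of the point set.
    """
    feats = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            v = points[j] - points[i]
            d = np.linalg.norm(v)
            if d < 1e-9:
                continue
            u = v / d
            a1 = np.arccos(np.clip(np.dot(normals[i], u), -1.0, 1.0))
            a2 = np.arccos(np.clip(np.dot(normals[j], -u), -1.0, 1.0))
            a3 = np.arccos(np.clip(np.dot(normals[i], normals[j]), -1.0, 1.0))
            feats.append([d, a1, a2, a3])
    return np.array(feats)

def histogram_descriptor(feats, bins=8, d_max=0.2):
    """Bin pairwise features into a joint histogram so that a sparse set of
    hand contact points and a densely sampled object surface are comparable."""
    edges = [np.linspace(0.0, d_max, bins + 1)] + [np.linspace(0.0, np.pi, bins + 1)] * 3
    hist, _ = np.histogramdd(feats, bins=edges)
    hist = hist.ravel()
    total = hist.sum()
    return hist / total if total > 0 else hist

def descriptor_distance(h1, h2):
    """Chi-squared distance between normalized histograms (smaller = more similar)."""
    denom = h1 + h2
    mask = denom > 0
    return 0.5 * np.sum((h1[mask] - h2[mask]) ** 2 / denom[mask])

# Toy usage: rank two stored grasp examples against a query object's surface samples.
rng = np.random.default_rng(0)
def random_oriented_points(k):
    p = rng.uniform(-0.05, 0.05, size=(k, 3))
    n = rng.normal(size=(k, 3))
    return p, n / np.linalg.norm(n, axis=1, keepdims=True)

query = histogram_descriptor(pair_features(*random_oriented_points(50)))
examples = [histogram_descriptor(pair_features(*random_oriented_points(5))) for _ in range(2)]
best = min(range(len(examples)), key=lambda i: descriptor_distance(query, examples[i]))
print("best-matching grasp example:", best)
```

In this toy setup the grasp examples contribute only a handful of oriented contact points while the query object contributes many surface samples; comparing normalized histograms is one simple way to handle that sparsity, which is the retrieval problem the abstract highlights.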
