Abstract

Multi-fingered robot hands have received much attention in various fields. We have developed a multi-fingered robot hand equipped with multi-axis force/torque sensors. For stable transportation, the robot hand must pick up an object without dropping it and set it down without damaging it. This paper deals with a pick-up motion based on vision and tactile information using the developed robot hand. The robot hand first learns a posture for picking up an object from tactile values and the visual image in advance, then determines the number of fingers to use in the pick-up motion from the visual image. The effectiveness of the proposed grasp selection is verified through experiments with the universal robot hand.
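The abstract does not specify how the visual image is mapped to a finger count, so the following is only an illustrative sketch: a hypothetical rule that selects the number of fingers from an object's apparent width, as a vision-based grasp selector of this kind might do. The function name and all thresholds are invented for illustration and are not taken from the paper.

```python
# Hypothetical grasp selection: map the object's apparent width (estimated
# from the visual image) to the number of fingers used in the pick-up motion.
# Thresholds below are invented for illustration, not from the paper.

def select_finger_count(object_width_mm: float) -> int:
    """Choose how many fingers to use based on apparent object width."""
    if object_width_mm < 20.0:
        return 2   # small object: two-finger pinch
    elif object_width_mm < 60.0:
        return 3   # medium object: tripod grasp
    else:
        return 5   # large object: full-hand grasp

if __name__ == "__main__":
    for width in (10.0, 40.0, 90.0):
        print(f"width {width} mm -> {select_finger_count(width)} fingers")
```

In practice such a selector would take features extracted from the camera image rather than a single width value, but the structure — vision-derived features in, discrete finger count out — matches the selection step described above.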
