Abstract

Determining goal configurations that lead to successful grasps is a critical, time-consuming stage in reach-to-grasp planning, especially in unstructured, cluttered environments. While traditional analytic algorithms are computation-intensive and susceptible to uncertainty, modern data-driven algorithms offer no success guarantees and require large datasets for learning models of reach-to-grasp motion. Graspability maps are data structures that store wrist configurations leading to successful grasps of an object. They are suitable both for direct use in reach-to-grasp motion planning and as grasp databases for gripper design analysis and for learning grasp models. Because graspability maps can be computed from analytical models, they facilitate the integration of analytical grasp-quality guarantees with data-driven grasp planning. Yet current methods for computing graspability maps are prohibitively time-consuming for many application scenarios. In this work, we propose a method for adapting graspability maps of known objects (shape primitives) to familiar and unknown objects. The method enables run-time generation of graspability maps and significantly enhances their usability. Adapted maps are generated by detecting shape primitives in the object to be grasped, scaling the a priori generated maps to the required dimensions, and combining the scaled maps into a compound graspability map. Simulation results confirm that map adaptation significantly reduces computation time without critically degrading map quality. A case-study evaluation with objects from a public point-cloud database corroborates the method's ability to quickly and accurately generate high-quality graspability maps for familiar and unknown objects.
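For illustration only, the adaptation pipeline the abstract describes (detect primitives, scale the precomputed per-primitive maps, merge into a compound map) could be sketched as below. This is a minimal sketch under assumed representations: the data layout, the names `GraspabilityMap`, `scale_map`, and `compound_map`, and the primitive-detector output format are all hypothetical and not taken from the paper.

```python
"""Hypothetical sketch of graspability-map adaptation: scale a priori
per-primitive maps to detected primitive dimensions and merge them into
a compound map in the object frame. Illustrative, not the authors' code."""

from dataclasses import dataclass
import numpy as np

@dataclass
class GraspabilityMap:
    """Wrist configurations with analytic grasp-quality scores."""
    poses: np.ndarray    # shape (N, 4, 4), homogeneous wrist poses in the primitive frame
    quality: np.ndarray  # shape (N,), grasp-quality score per configuration

def scale_map(canonical: GraspabilityMap, scale: np.ndarray) -> GraspabilityMap:
    """Scale a canonical primitive's map to the detected primitive's dimensions.

    Only the translational part of each wrist pose is scaled per axis;
    orientations are kept as-is (an assumed simplification).
    """
    poses = canonical.poses.copy()
    poses[:, :3, 3] *= scale  # per-axis scaling of wrist positions
    return GraspabilityMap(poses, canonical.quality.copy())

def compound_map(primitives, library) -> GraspabilityMap:
    """Combine scaled per-primitive maps into one compound map.

    `primitives`: list of (kind, scale, T_obj_prim) tuples from a primitive
    detector, where T_obj_prim is the 4x4 primitive-to-object transform.
    `library`: dict mapping each primitive kind to its a priori map.
    """
    all_poses, all_quality = [], []
    for kind, scale, T_obj_prim in primitives:
        scaled = scale_map(library[kind], np.asarray(scale))
        # Express each wrist pose in the object frame; (4,4) @ (N,4,4) broadcasts.
        all_poses.append(T_obj_prim @ scaled.poses)
        all_quality.append(scaled.quality)
    return GraspabilityMap(np.concatenate(all_poses), np.concatenate(all_quality))
```

Scaling only the translational components while preserving wrist orientations assumes that approach directions remain valid under moderate scale changes; whether the actual method rescores or filters configurations after scaling is not specified in the abstract.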
