Abstract

Existing robotic systems face a tension between generality and precision. Deployed solutions for robotic manipulation tend to fall into the paradigm of one robot solving a single task, lacking "precise generalization," or the ability to solve many tasks without compromising on precision. This paper explores solutions for precise and general pick and place. In precise pick and place, also known as kitting, the robot transforms an unstructured arrangement of objects into an organized arrangement, which can facilitate further manipulation. We propose SimPLE (Simulation to Pick Localize and placE) as a solution to precise pick and place. SimPLE learns to pick, regrasp, and place objects given the object's computer-aided design model and no prior experience. We developed three main components: task-aware grasping, visuotactile perception, and regrasp planning. Task-aware grasping computes affordances of grasps that are stable, observable, and favorable to placing. Visuotactile perception matches real observations against a set of simulated ones through supervised learning to estimate a distribution of likely object poses. Regrasp planning computes a multistep pick-and-place plan by solving a shortest-path problem on a graph of hand-to-hand regrasps. On a dual-arm robot equipped with visuotactile sensing, SimPLE demonstrated pick and place of 15 diverse objects spanning a wide range of shapes, achieving successful placements into structured arrangements with 1-mm clearance more than 90% of the time for six objects and more than 80% of the time for 11 objects.
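The abstract describes regrasp planning as a shortest-path search over a graph of hand-to-hand regrasps. The sketch below illustrates only that formulation: nodes, edge costs, and the toy graph contents are hypothetical placeholders and do not reflect the paper's actual implementation or data.

```python
import heapq

# Hypothetical regrasp graph (illustrative only): nodes are (arm, grasp) states,
# edges are assumed-feasible transitions such as hand-to-hand transfers or a
# direct placement, weighted by an assumed cost (e.g., expected execution time).
REGRASP_GRAPH = {
    ("left", "grasp_A"): [(("right", "grasp_B"), 1.0), (("left", "place"), 5.0)],
    ("right", "grasp_B"): [(("right", "place"), 1.0)],
    ("left", "place"): [],
    ("right", "place"): [],
}


def shortest_regrasp_plan(graph, start, goal):
    """Dijkstra search returning the lowest-cost sequence of grasp states."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []


if __name__ == "__main__":
    cost, plan = shortest_regrasp_plan(REGRASP_GRAPH, ("left", "grasp_A"), ("right", "place"))
    print(f"plan: {plan} (cost {cost})")
```

In this toy graph, picking with the left arm and placing directly is feasible but costly, so the planner prefers the cheaper route of transferring the object to the right arm before placing, mirroring how a multistep regrasp plan can beat a single-grasp strategy.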
