Abstract

We address the problem of shaping deformable plastic materials using non-prehensile actions. Shaping plastic objects is challenging because they are difficult to model and to track visually. We study this problem using kinetic sand, a plastic toy material that mimics the physical properties of wet sand. Inspired by a pilot study in which humans shape kinetic sand, we define two types of actions: pushing the material from the sides and tapping it from above. The chosen actions are executed with a robotic arm using image-based visual servoing. From the current and desired views of the material, we define states based on visual features such as the outer contour shape and the pixel luminosity values. These states are mapped to actions, and the mapping is applied iteratively to reduce the image error until convergence is reached. For pushing, we propose three methods for mapping the visual state to an action; these include heuristic methods and a neural network trained on human actions. We show that it is possible to obtain simple shapes with the kinetic sand without explicitly modeling the material. Our approach is limited in the types of shapes it can achieve; a richer set of action types and multi-step reasoning would be needed to achieve more sophisticated shapes.
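The abstract summarizes an iterative visual-servoing loop: measure the image error between the current and desired views, map the visual state to a push or tap action, and repeat until convergence. The following is a minimal sketch of that loop, not the paper's implementation. The interfaces `get_current_image`, `select_action`, and `execute_action` are hypothetical placeholders, and a plain pixel-wise difference stands in for the paper's contour and luminosity features.

```python
import numpy as np

def shape_material(get_current_image, desired_image, select_action,
                   execute_action, error_threshold=0.05, max_iters=50):
    """Iteratively act on the material until the image error converges.

    Hypothetical interfaces (assumptions, not from the paper):
      get_current_image() -> grayscale image of the material (uint8 array)
      select_action(current, desired) -> an action (e.g. a push or tap),
          standing in for the paper's heuristic or learned state-action maps
      execute_action(action) -> executes the action with the robot arm
    """
    for _ in range(max_iters):
        current = get_current_image()
        # Image error: a simple normalized pixel-wise difference; the paper
        # instead uses features such as outer contours and luminosity values.
        error = np.abs(current.astype(float)
                       - desired_image.astype(float)).mean() / 255.0
        if error < error_threshold:
            return True   # converged to the desired shape
        execute_action(select_action(current, desired_image))
    return False          # did not converge within the iteration budget
```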
