Abstract

Tactile sensing is a valuable source of information in robotics for perceiving the state of objects and their properties. Modern soft tactile sensors can perceive orthogonal forces and, in some cases, relative motions along the surface of the object. Detecting and measuring this kind of lateral motion is fundamental for reacting to possibly uncontrolled slipping and sliding of the object being manipulated. Object slip detection and prediction have been extensively studied in the robotics community, leading to solutions with good accuracy that are suitable for closed-loop grip stabilization. However, algorithms for object perception, such as in-hand object pose estimation and tracking algorithms, often assume no relative motion between the object and the hand, and rarely consider the problem of tracking the pose of an object subjected to slipping and sliding motions. In this work, we propose a differentiable Extended Kalman filter that can be trained to track the position and velocity of an object in a translational sliding regime from tactile observations alone. Experiments with several objects, carried out on the iCub humanoid robot platform, show that the proposed approach achieves an average position tracking error on the order of 0.6 cm, and that the provided estimate of the object state can be used to make control decisions using tactile feedback alone. A video of the experiments is available as Supplementary Material.
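To make the filtering idea concrete, the sketch below implements a plain constant-velocity Kalman filter for planar sliding in NumPy. It is not the paper's method: the proposed differentiable EKF learns its models from tactile data, whereas here the state layout [x, y, vx, vy], the linear motion and measurement models, and all noise parameters are hand-picked illustrative assumptions.

```python
import numpy as np

class SlidingKF:
    """Minimal constant-velocity Kalman filter sketch for planar sliding.

    Hypothetical illustration only: the motion model, measurement model,
    and noise covariances below are assumptions, not learned components.
    """

    def __init__(self, dt=0.01, q=1e-8, r=1e-6):
        # State: [x, y, vx, vy]; constant-velocity transition over one step dt.
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)       # the tactile sensor observes position only
        self.Q = q * np.eye(4)      # process noise covariance (assumed)
        self.R = r * np.eye(2)      # measurement noise covariance (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def step(self, z):
        """One predict/update cycle given a position measurement z = [x, y]."""
        # Predict: propagate state and covariance through the motion model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update: correct with the innovation between measurement and prediction.
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x.copy()
```

Feeding the filter a stream of noisy contact positions recovers both the position and the sliding velocity, which is the kind of state estimate the abstract describes using for control decisions. In a learned variant, F, H, Q, and R would be parameterized and trained end-to-end through the filter recursion.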

Highlights

  • Object perception is one of the key problems of autonomous manipulation as it enables taking informed decisions based on the state of the object

  • Recent approaches proposed in the literature deal with the problem of estimating the 6-dimensional pose of the object from RGB(-D) images using Deep Convolutional Neural Networks (Xiang et al., 2018; Tremblay et al., 2018; Peng et al., 2019; Hodaň et al., 2020)

  • After a section where we review the state of the art on both object pose tracking and slip detection and prediction using tactile sensors, we present our algorithm for object sliding tracking

Introduction

Object perception is one of the key problems of autonomous manipulation, as it enables taking informed decisions based on the state of the object. Recent approaches proposed in the literature deal with the problem of estimating the 6-dimensional pose of the object from RGB(-D) images using Deep Convolutional Neural Networks (Xiang et al., 2018; Tremblay et al., 2018; Peng et al., 2019; Hodaň et al., 2020). These methods are effective for detecting the pose of the object in the absence of contacts with the end-effector, but might suffer in the presence of challenging occlusions, e.g., when the robot interacts with the object (Wen et al., 2020). Among the most typical in-hand object motions, object slipping and sliding are challenging to perceive and control. For this reason, they have been extensively studied within the literature on tactile-based perception and control. In some of these works, however, experimental results regarding slippage are not provided for the real-world scenario and are analyzed only in simulation.
