Abstract

Damage-free grasping of deformable objects is a long-standing challenge in robotics. Humans can perceive the physical properties of an object and apply precisely the right amount of force to manipulate fragile targets dexterously and without damage. To transfer this ability to robots, and motivated by the strong performance and widespread success of deep learning on large-scale datasets, we propose a neural network based on an improved Transformer structure that processes the complete tactile time sequence recorded during a grasp. Compared with computer vision, grasp datasets in the haptic domain remain scarce, and tactile datasets for fruit grasping are almost unavailable. We therefore built a tactile dataset containing 9,375 grasps of 15 fruits for experimental research. The proposed network achieves a fruit recognition accuracy of 97.33% on this dataset, outperforming a traditional recurrent neural network (RNN) model. We further evaluate the model from several aspects and explore the prediction of the subsequent grasp force. This work contributes to robotic haptic perception and damage-free grasping in agriculture, and can benefit fruit picking, handling, sorting, and related tasks. Our method also provides a technical reference for other research on tactile grasp data.
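The abstract does not specify the network's architecture beyond "an improved Transformer structure" applied to tactile time sequences. As a rough, hedged illustration of the general idea only (not the authors' actual model), the sketch below shows a minimal single-head self-attention layer followed by mean-pooling and a linear classifier over a tactile sequence. All shapes, parameter names, and the pooling choice are assumptions for illustration; the dimensions are arbitrary except for the 15 output classes, matching the 15 fruits in the dataset.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (T, d) tactile sequence, one d-dimensional reading per time step.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (T, T) attention weights
    return scores @ V                                  # (T, d) attended features

def classify(X, params):
    # One attention layer, mean-pool over time, then a linear read-out.
    H = self_attention(X, params['Wq'], params['Wk'], params['Wv'])
    pooled = H.mean(axis=0)          # (d,) sequence summary
    logits = pooled @ params['Wo']   # (num_classes,)
    return softmax(logits)           # class probabilities

# Hypothetical dimensions: 50 time steps, 16 tactile features, 15 fruit classes.
rng = np.random.default_rng(0)
T, d, num_classes = 50, 16, 15
params = {k: rng.standard_normal((d, d)) * 0.1 for k in ('Wq', 'Wk', 'Wv')}
params['Wo'] = rng.standard_normal((d, num_classes)) * 0.1
probs = classify(rng.standard_normal((T, d)), params)
```

A real model of this kind would stack several such layers with positional encodings, feed-forward sublayers, and learned weights; this sketch only conveys how a Transformer-style encoder can map a variable-length tactile sequence to a fixed-size class distribution.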
