Abstract

Humanoid robots are designed to resemble humans in structure and behavior, and they show growing application potential across many fields. Like their biological counterparts, they depend fundamentally on environmental perception. In particular, vision and touch are the two main sensory modalities that humanoids use to understand and interact with their environment. Vision-Tactile Fusion Perception (VTFP) offers multiple routes to more robust sensing and understanding under challenging conditions, raising new research interests and questions, and the overlap between visual and tactile perception in humanoids continues to grow. This work reviews the current state of the art of VTFP. It begins with the physiological basis of biological visual and tactile systems, together with the mechanisms of VTFP, as inspirations for humanoid perception. Bioinspired visual-tactile fusion systems for humanoids are then reviewed as the emphasis of this survey. After a survey of robotic vision and tactile sensors, seven currently publicly available VTFP datasets are introduced; they serve as data sources for several studies on neural-network-based fusion algorithms. Furthermore, the applications of VTFP on humanoids are summarized. Finally, the challenges and future work are discussed. This review aims to provide references for further exploration of VTFP and its applications on humanoids.
