Abstract

Physical characteristics of terrains, such as softness and friction, provide essential information that legged robots need to avoid non-geometric obstacles, such as mires and slippery stones, in the wild. Perception of such characteristics typically relies on tactile sensing and visual prediction. Although tactile perception is more accurate, it works only at close range; conversely, building a supervised or self-supervised contactless prediction system with computer vision requires sufficient labeled data and cannot adapt to dynamic environments. In this paper, inspired by the behavior of animals, we propose an unsupervised learning framework that enables legged robots to learn the physical characteristics of terrains; to the best of our knowledge, this is the first framework to do so online, incrementally, and with the ability to resolve cognitive conflicts. The proposed scheme allows a robot to interact with its environment and adjust its cognition in real time, thereby endowing it with adaptability. Indoor and outdoor experiments on a hexapod robot show that the robot can independently extract tactile and visual features of terrains to build cognitive networks; that an associative layer linking visual and tactile features is formed during the robot's exploration; and that, with this layer, the robot can autonomously generate a physical segmentation model of terrains and resolve cognitive conflicts in an ever-changing environment, facilitating safe navigation.
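The abstract does not give implementation details, but the core idea of an associative layer between visual and tactile cues can be sketched. The Python snippet below is a hypothetical minimal illustration under stated assumptions, not the paper's algorithm: it assumes precomputed visual and tactile feature vectors, clusters each modality incrementally by nearest-prototype assignment, stores a visual-to-tactile co-occurrence matrix as the "associative layer", and weakens stale links when a tactile observation contradicts the visual prediction (a crude stand-in for resolving cognitive conflicts). The class name OnlineAssociativeMemory and all thresholds are invented for illustration.

import numpy as np


class OnlineAssociativeMemory:
    """Toy online visual-tactile association (illustrative only, not the paper's method)."""

    def __init__(self, threshold=1.0, lr=0.1):
        self.visual_protos = []        # visual cluster prototypes (terrain appearance)
        self.tactile_protos = []       # tactile cluster prototypes (terrain feel)
        self.assoc = np.zeros((0, 0))  # co-occurrence counts: visual class x tactile class
        self.threshold = threshold     # distance above which a new cluster is spawned
        self.lr = lr                   # prototype adaptation rate

    def _assign(self, protos, x):
        """Return (index, is_new) of the nearest prototype, spawning a new one if needed."""
        x = np.asarray(x, dtype=float)
        if protos:
            d = [np.linalg.norm(x - p) for p in protos]
            i = int(np.argmin(d))
            if d[i] < self.threshold:
                protos[i] += self.lr * (x - protos[i])   # incremental prototype update
                return i, False
        protos.append(x.copy())
        return len(protos) - 1, True

    def observe(self, visual_feat, tactile_feat):
        """Cluster both modalities online and strengthen their association."""
        vi, v_new = self._assign(self.visual_protos, visual_feat)
        ti, t_new = self._assign(self.tactile_protos, tactile_feat)
        if v_new or t_new:             # grow the association matrix as clusters appear
            grown = np.zeros((len(self.visual_protos), len(self.tactile_protos)))
            grown[:self.assoc.shape[0], :self.assoc.shape[1]] = self.assoc
            self.assoc = grown
        self.assoc[vi, ti] += 1.0
        return vi, ti

    def predict_tactile(self, visual_feat):
        """Predict the most likely tactile class from vision alone (contactless)."""
        if not self.visual_protos or self.assoc.size == 0:
            return None
        d = [np.linalg.norm(np.asarray(visual_feat, dtype=float) - p)
             for p in self.visual_protos]
        vi = int(np.argmin(d))
        if self.assoc[vi].sum() == 0:
            return None
        return int(np.argmax(self.assoc[vi]))

    def resolve_conflict(self, visual_feat, tactile_feat, decay=0.5):
        """When touch contradicts the visual prediction, weaken the stale link
        and reinforce the association actually observed."""
        predicted = self.predict_tactile(visual_feat)
        vi, ti = self.observe(visual_feat, tactile_feat)
        if predicted is not None and predicted != ti:
            self.assoc[vi, predicted] *= decay
        return vi, ti

In this sketch, after a few visually distinct terrains have been walked over, predict_tactile lets the robot anticipate the tactile class of a terrain it only sees, and resolve_conflict revises the association when stepping onto the terrain contradicts that expectation.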
