Abstract

Traditional robot navigation passively plans and replans to avoid any contact with obstacles in the scene. This limits the obtained solutions to the collision‐free space and leads to failures if the path to the goal is obstructed. In contrast, humans actively modify their environment by repositioning objects if it assists locomotion. This article aims to bring robots closer to such abilities by providing a framework to detect and clear movable obstacles to continue navigation. The approach leverages a multimodal robot skin that provides both local proximity and tactile feedback regarding physical interactions with the surroundings. This multimodal contact feedback is employed to adapt the robot's behavior when interacting with object surfaces and to regulate the applied forces. This enables the robot to remove bulky obstacles from its path and solves otherwise infeasible navigation problems. The system's ability is demonstrated in simulation and real‐world scenarios involving movable and nonmovable obstacles.
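As a rough illustration of the kind of behavior the abstract describes, the sketch below shows a minimal velocity command that blends proximity and tactile feedback: approach a surface more slowly as proximity decreases, then regulate the applied force once contact is made. All function names, gains, and units here are illustrative assumptions, not the article's actual controller.

```python
def push_velocity(measured_force, proximity, f_desired=5.0,
                  k_f=0.02, v_approach=0.1):
    """Hypothetical forward-velocity command (m/s) for pushing a movable obstacle.

    measured_force: contact force from the tactile layer (N), 0 before contact.
    proximity: normalized proximity reading in [0, 1] (1 = far, 0 = touching).
    f_desired, k_f, v_approach: illustrative setpoint, gain, and approach speed.
    """
    if measured_force <= 0.0:
        # Pre-contact: slow down as the proximity sensor reports a nearer surface.
        return v_approach * max(proximity, 0.0)
    # In contact: simple proportional regulation toward the desired push force.
    return k_f * (f_desired - measured_force)
```

In this toy scheme the command naturally reverses sign (the robot backs off) when the measured force exceeds the setpoint, which is one plausible way a robot could distinguish a movable obstacle from a nonmovable one.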
