Abstract

Collaboration between aerial and ground robots can benefit from exploiting the complementary capabilities of each system, thereby improving situational awareness and environment interaction. For this purpose, we present a localization method that allows the ground robot to determine and track its position within a map acquired by a flying robot. To maintain invariance with respect to differing sensor choices and viewpoints, the method utilizes elevation maps built independently by each robot's onboard sensors. The elevation maps are then used for global localization: specifically, we find the relative position and orientation of the ground robot using the aerial map as a reference. Our work compares four similarity measures for computing the congruence of elevation maps (akin to dense, image-based template matching) and evaluates their merit. Furthermore, a particle filter is implemented for each similarity measure to track multiple location hypotheses and to exploit the robot's motion to converge to a unique solution. This allows the ground robot to take advantage of the flying robot's more extensive map coverage. The presented method is demonstrated through the collaboration of a quadrotor equipped with a downward-facing monocular camera and a walking robot equipped with a rotating laser range scanner.
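The abstract does not specify the four similarity measures compared in the paper. As a purely illustrative sketch of the general idea, the following uses sum of squared differences (one plausible choice, not necessarily one of the paper's measures) to match the ground robot's elevation patch against the aerial elevation map over all translations and the four cardinal rotations; all function and variable names here are hypothetical:

```python
import numpy as np

def match_score(patch, window):
    """Negative sum of squared differences over cells observed in both maps.

    Elevation maps typically contain unobserved cells; these are assumed
    to be stored as NaN and are excluded from the comparison.
    """
    valid = ~np.isnan(patch) & ~np.isnan(window)
    if not valid.any():
        return -np.inf
    d = patch[valid] - window[valid]
    return -np.sum(d * d)

def localize(aerial, ground):
    """Exhaustively match the ground-robot elevation patch against the
    aerial map over all translations and 0/90/180/270-degree rotations.
    Returns (row, col, yaw_degrees) of the best-scoring placement."""
    H, W = aerial.shape
    best_score, best_pose = -np.inf, None
    for k in range(4):                      # four cardinal rotations
        patch = np.rot90(ground, k)
        h, w = patch.shape
        for r in range(H - h + 1):          # slide over all translations
            for c in range(W - w + 1):
                s = match_score(patch, aerial[r:r + h, c:c + w])
                if s > best_score:
                    best_score, best_pose = s, (r, c, 90 * k)
    return best_pose
```

A real system would refine this with a finer rotation search and, as the abstract describes, feed the resulting score surface into a particle filter so that several competing location hypotheses can be tracked until the robot's motion disambiguates them.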
