Abstract

Multiple mobile robots increasingly play a key role in industrial systems, such as factory freight logistics, patrol security in factory environments, and multi-robot collaborative services. As a key issue in industrial environment perception, accurate robot localization enhances robots' autonomy and is an important branch of robotics research in artificial intelligence. In this paper, we propose a new method for cooperative autonomous localization between air and ground robots in wide-ranging outdoor industrial environments. The aerial robot first maps an area of interest and achieves self-localization. It then transfers a simplified orthogonal-perspective 2.5-D map to the ground robots for collaboration. Within this collaboration, a ground robot estimates its pose with respect to the unmanned aerial vehicle by instantaneously registering a single panorama against the 2.5-D map, which serves as the spatial association between the air and ground robots. The ground robot estimates its orientation from automatically detected geometric information and recovers its translation by aligning the 2.5-D map with a semantic segmentation of the panorama. Our method effectively overcomes the dramatic differences between the aerial view and the ground-level view. A set of experiments performed in an outdoor industrial environment demonstrates the applicability of our localization method: the proposed robotic collaborative localization outperforms most consumer-grade sensors in accuracy and also achieves an outstanding running time.
