Abstract

LiDAR scanners are commonly used for mapping and localization with mobile robots. However, they cannot see through occlusions such as those occurring in harsh environments containing smoke, fog, or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution and cannot represent an environment with the same quality. In the following article, we present the integration of fused LiDAR and radar data into a SLAM cycle and continue our work from [1], where we presented first results regarding a feature-based and a scan-matching-based approach for SLAM in environments with changing visibility using LiDAR and radar sensors. As new content in this article, the data fusion takes place on scan level as well as on map level and aims at an optimal map quality considering the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model the aerosol concentration with fused LiDAR and radar data in parallel to the mapping process using a finite difference model, without involving a smoke or gas sensor. Overall, our method allows modeling the structure of an environment including the dynamic distribution of aerosol concentration.
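The abstract mentions a finite difference model for the aerosol concentration; the paper's concrete formulation is not given here, so the following is only a minimal illustrative sketch of an explicit finite-difference diffusion step on a 2-D concentration grid. The grid size, diffusion coefficient `d`, and periodic boundary handling via `np.roll` are all assumptions for the example, not the authors' method.

```python
import numpy as np

def diffuse_step(c, d=0.1, dt=1.0, dx=1.0):
    """One explicit finite-difference step of 2-D diffusion:
    c <- c + d*dt/dx^2 * Laplacian(c).
    Stable for d*dt/dx^2 <= 0.25 (CFL-like condition)."""
    lap = (np.roll(c, 1, axis=0) + np.roll(c, -1, axis=0)
           + np.roll(c, 1, axis=1) + np.roll(c, -1, axis=1) - 4.0 * c)
    return c + d * dt / dx**2 * lap

# Hypothetical example: a point release of aerosol spreading over time.
grid = np.zeros((32, 32))
grid[16, 16] = 1.0
for _ in range(50):
    grid = diffuse_step(grid)
# With periodic boundaries, total concentration (mass) is conserved
# while the peak value decreases as the aerosol spreads out.
```

In the paper's setting, the measured visibility (from fused LiDAR and radar returns) would presumably take the role of the concentration observations that such a model propagates; the sketch only shows the numerical scheme itself.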
