Abstract

Image degradation caused by bad weather, such as haze, rain, and snow, significantly impairs a vision system’s performance, posing challenges to the autonomous motion of mobile robots. It is therefore crucial to restore degraded images for mobile robots operating outdoors. Spurred by this concern, a framework is presented that restores degraded images and affords simultaneous localization and mapping (SLAM) under multiple bad weather conditions. Specifically, the developed architecture combines the advantages of convolutional neural network features and weather features, and improves the accuracy of degradation-type identification through a weather inference module based on ensemble learning. Based on the internal mechanism of image degradation, a restoration method built on physical models is proposed, with a subsequent operation refining the preliminary restored results. To improve the method’s adaptability to real scenes, unpaired real-world weather images are introduced into the degradation removal algorithm through generative adversarial networks. The proposed weather-persistence assumption combines the weather inference and degradation restoration modules within the SLAM system to handle multiple weather conditions, improving the system’s accuracy and running speed. Comprehensive experiments evaluating the developed framework and its components show that the proposed weather inference and degraded image restoration methods are highly effective. The final experimental results demonstrate that the framework autonomously identifies weather types, triggers the corresponding restoration, and achieves accurate localization under multiple bad weather conditions, with SLAM accuracy close to that of the raw SLAM system in clear weather.
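For concreteness, the weather-persistence idea described above can be pictured as a minimal sketch: the weather type inferred for one frame is assumed to hold for subsequent frames, so the inference module runs only periodically while the matching restoration is applied to every frame before SLAM tracking. All names and the fixed re-inference period below are illustrative assumptions, not the authors’ actual interface.

```python
REINFER_PERIOD = 30  # assumed: re-check the weather type every 30 frames


def process_stream(frames, infer_weather, restorers, slam):
    """Restore each frame according to the current weather type, then track.

    frames        : iterable of images from the robot's camera
    infer_weather : callable returning a label such as 'clear', 'haze', 'rain', 'snow'
    restorers     : dict mapping a weather label to a restoration callable
    slam          : object exposing a track(image) method (hypothetical interface)
    """
    weather = None
    for idx, frame in enumerate(frames):
        # Weather-persistence assumption: re-run the (costlier) inference
        # module only on the first frame and then every REINFER_PERIOD frames.
        if weather is None or idx % REINFER_PERIOD == 0:
            weather = infer_weather(frame)

        # Trigger the restoration branch matching the inferred degradation;
        # clear frames pass through unchanged.
        restore = restorers.get(weather)
        restored = restore(frame) if restore is not None else frame

        # Feed the restored frame to the SLAM front end for localization.
        slam.track(restored)
```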
