Abstract
In autonomous driving, object detection is a foundational step for many subsequent processes. However, it is challenged by the loss of visibility caused by rain. Rain degrades captured video in two main forms, rain streaks and rain streak accumulation, each of which affects the image differently; therefore, they cannot be mitigated in the same way. We propose a lightweight network that mitigates both types of rain degradation in real time without negatively affecting the object-detection task. The proposed network consists of two modules applied progressively: the first is a progressive ResNet for rain streak removal, and the second is a transmission-guided lightweight network for rain streak accumulation removal. The network has been tested on synthetic and real rainy datasets and compared with state-of-the-art (SOTA) networks. In addition, time performance has been evaluated to confirm real-time operation, and the effect of the deraining network on the YOLO object-detection network has been assessed. The proposed network exceeds SOTA by 1.12 dB in average PSNR over multiple synthetic datasets with a 2.29× speedup. Finally, the inclusion of distinct lightweight stages works favorably for real-time applications and could be extended to mitigate other degradation factors such as snow and sun glare.
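The two-stage design can be illustrated with a minimal sketch: a progressive (recurrent) residual stage for streaks followed by a transmission-guided stage for accumulation, applied in sequence before the detector. The module names, layer sizes, recurrence depth, and the scattering-model inversion below are illustrative assumptions for exposition, not the authors' exact architecture.

```python
# Minimal PyTorch-style sketch of a two-stage deraining pipeline (assumed layout).
import torch
import torch.nn as nn


class StreakRemovalResNet(nn.Module):
    """Hypothetical progressive ResNet: refines the image over several passes."""

    def __init__(self, channels=32, stages=4):
        super().__init__()
        self.stages = stages
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, x):
        out = x
        for _ in range(self.stages):       # progressive refinement
            out = out - self.body(out)     # subtract an estimated streak residual
        return out


class AccumulationRemovalNet(nn.Module):
    """Hypothetical transmission-guided stage: predicts a transmission map and
    recovers the scene with an atmospheric-scattering-style inversion."""

    def __init__(self, channels=16):
        super().__init__()
        self.trans = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x, airlight=1.0):
        t = self.trans(x).clamp(min=1e-3)      # estimated transmission map
        return (x - airlight * (1 - t)) / t    # invert I = J*t + A*(1 - t)


class TwoStageDerain(nn.Module):
    def __init__(self):
        super().__init__()
        self.streaks = StreakRemovalResNet()
        self.accumulation = AccumulationRemovalNet()

    def forward(self, x):
        return self.accumulation(self.streaks(x))


# The restored frame would then be passed to the object detector (e.g. YOLO).
derained = TwoStageDerain()(torch.rand(1, 3, 128, 128))
```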