Abstract

Image dehazing has been widely used in vision-based fields such as detection, segmentation, traffic monitoring, and automated vehicle systems. However, most existing end-to-end dehazing networks are fully data-driven, without physical constraints or prior-information guidance, which makes it difficult to explore the latent structures and statistical characteristics of hazy images. We propose a novel Retinex decomposition-fusion dehazing network consisting of a dual-branch decomposition module and a fusion optimization module. Unlike existing solutions, we decompose the clear images in the commonly used RESIDE dataset based on Retinex theory to construct clear illumination-map and reflection-map datasets that drive the network training, equivalently imposing reasonable constraints on the network and achieving impressive dehazing performance. The dual-branch decomposition module is developed to estimate the illumination map and the reflection map, respectively. The illumination map mainly contains the global features of the image, whereas the reflection map reflects the inherent color properties of the image and contains rich details, with which we explore the latent structures and statistical characteristics of hazy images. In addition, the dual-branch structure avoids the error accumulation and information cancellation present in current methods. Subsequently, the estimated illumination map and reflection map are fused and refined via the fusion optimization module to obtain the dehazed image. Experiments show that the proposed network has better generalization and visual effects than existing fully data-driven methods and can be applied successfully to real-world scenarios.
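To make the Retinex decomposition concrete, the sketch below shows a classical single-scale Retinex-style split of an image into an illumination map (smooth, global) and a reflectance map (detail and color), the two quantities the dual-branch module is trained to estimate. This is an illustrative baseline only, not the paper's learned decomposition: the Gaussian-blur illumination estimate, the `sigma` value, and the function name `retinex_decompose` are assumptions for demonstration.

```python
import numpy as np

def retinex_decompose(img, sigma=15.0, eps=1e-6):
    """Single-scale Retinex-style decomposition: img ~ illumination * reflectance.

    Illumination is estimated by Gaussian smoothing (a common classical choice,
    assumed here for illustration); reflectance is the pointwise ratio.
    `img` is a 2-D array of positive intensities.
    """
    # Build a 1-D Gaussian kernel (separable blur).
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()

    # Edge-pad, then blur rows and columns with the same kernel.
    padded = np.pad(img, radius, mode="edge")
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, blurred)

    # Smooth estimate = illumination; ratio = reflectance (details, color).
    illumination = np.clip(blurred, eps, None)
    reflectance = img / illumination
    return illumination, reflectance

# Usage on a small synthetic gradient image.
img = np.linspace(0.1, 1.0, 32 * 32).reshape(32, 32)
L, R = retinex_decompose(img, sigma=2.0)
```

By construction the two maps multiply back to the input (`L * R == img` up to floating-point rounding), which is the consistency property a fusion module can exploit when recombining the refined branches.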
