Abstract

Beyond high-level computer vision tasks, deep learning has also made significant progress in low-level tasks such as single image dehazing. A well-detailed image looks realistic and natural, with clear edges and balanced colour. To achieve a clearer and more vivid view, the proposed work exploits the role of edges and colours. A progressive two-stage image dehazing network is presented to overcome the challenges of current image dehazing algorithms. The framework is divided into two stages: in the first stage, an encoder-decoder structure extracts multiscale image features; the second stage applies a Color Correction Model (CCM) that recovers balanced colour close to the ground truth. The encoder-decoder network is built from dense residual attention units (DRAUs), each combining a channel attention mechanism with a pixel attention mechanism. Without the DRAU, features are weighted uniformly even though haze density varies across pixels and different channels carry unequal information. The DRAU treats different features and pixels unequally, which offers more flexibility in handling various types of detailed information. The proposed two-stage network exceeds state-of-the-art algorithms in both visual and quantitative comparisons, achieving peak signal-to-noise ratios of 33.55 dB and 33.44 dB and SSIM values of 0.9619 and 0.9714 on the SOTS indoor and outdoor test sets, respectively.
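The abstract describes the DRAU as channel attention followed by pixel attention inside a residual unit. The paper's exact layer layout is not given here, so the following is a minimal NumPy sketch under assumed simplifications: the learned convolutions of a real attention module are replaced by parameter-free pooling plus a sigmoid gate, and the function names (`channel_attention`, `pixel_attention`, `dense_residual_attention_unit`) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Global average pooling per channel yields a
    # (C, 1, 1) gate, so informative channels are weighted more heavily.
    w = sigmoid(feat.mean(axis=(1, 2), keepdims=True))
    return feat * w

def pixel_attention(feat):
    # A (1, H, W) gate from the channel-averaged map lets the unit
    # weight pixels unequally, reflecting non-uniform haze density.
    w = sigmoid(feat.mean(axis=0, keepdims=True))
    return feat * w

def dense_residual_attention_unit(feat):
    # Hypothetical DRAU sketch: channel attention, then pixel attention,
    # with a residual (skip) connection back to the input features.
    return feat + pixel_attention(channel_attention(feat))

feat = np.random.rand(8, 16, 16).astype(np.float32)
out = dense_residual_attention_unit(feat)
print(out.shape)  # (8, 16, 16)
```

In an actual network these gates would be small learned convolution branches; the sketch only illustrates how channel-wise and pixel-wise reweighting compose inside one residual unit.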
