Abstract

Image dehazing is a critical image pre-processing task that estimates the haze-free image corresponding to an input hazy image. Despite recent advances, the task of image dehazing remains challenging, especially in the unsupervised scenario. Several efforts in the literature dehaze images in a supervised set-up, which requires a large number of paired clear and hazy images for training. Supervised approaches often become biased towards the nature of the haze present in the training images and produce less realistic results for query hazy images. We propose an Attention-based Global–Local Cycle-consistent Generative Adversarial Network (AGLC-GAN) for unpaired single image dehazing. The proposed CycleGAN-based AGLC-GAN model contains a dehazing generator encapsulating an autoencoder-like network with an attention mechanism, comprising channel attention and pixel attention, to deal with uneven haze intensity across the image. We use a global–local consistent discriminator to identify spatially varying haze and to improve discriminator stability. We adopt a cyclic perceptual consistency loss to maintain consistency in the feature space. A dynamic feature enhancement module and an adaptive mix-up module are included in the proposed generator to dynamically obtain more spatially structured features and hence adaptively preserve the flow of shallow features. Furthermore, we extensively evaluate the proposed model on multiple benchmark datasets to assess its efficacy in removing haze. The experimental results demonstrate a significant quantitative and qualitative improvement over existing methods for unpaired image dehazing. The code is available at the following link: https://github.com/jr925/AGLC-GAN
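
The abstract describes the channel-attention plus pixel-attention mechanism only at a high level. The following is a minimal PyTorch sketch of that idea (in the style of FFA-Net-like feature attention), not the authors' exact implementation: module names, the reduction ratio, and the residual layout are illustrative assumptions; the official code is available at the repository linked above.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze global spatial information into per-channel weights."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # B x C x 1 x 1
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.attention(x)

class PixelAttention(nn.Module):
    """Produce a per-pixel weight map so dense-haze regions get more emphasis."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.attention = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 1, 1),        # B x 1 x H x W
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.attention(x)

class FeatureAttentionBlock(nn.Module):
    """Residual block: conv -> channel attention -> pixel attention."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.ca = ChannelAttention(channels)
        self.pa = PixelAttention(channels)

    def forward(self, x):
        out = self.conv(x)
        out = self.pa(self.ca(out))
        return x + out  # residual connection preserves the flow of shallow features

if __name__ == "__main__":
    block = FeatureAttentionBlock(64)
    feats = torch.randn(1, 64, 128, 128)
    print(block(feats).shape)  # torch.Size([1, 64, 128, 128])
```

Weighting features per channel and per pixel lets the generator attend more strongly to regions where the haze is thicker, which is why such attention blocks are a natural fit for unevenly distributed haze.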
