Remote sensing images often suffer from atmospheric haze, which can significantly impair the quality and utility of the acquired data. To address this challenge, a novel dehazing method based on generative adversarial networks is proposed. It integrates a generator network, designed to enhance the clarity and detail of hazy images, with a discriminator network that distinguishes dehazed images from real clear ones. First, a dense residual block is designed to extract primary features. Next, a wavelet transform block captures high- and low-frequency features. In addition, a global and local attention block is proposed to suppress redundant features and increase the weight of important ones. PixelShuffle is adopted as the upsampling operation, allowing finer control of image details during upsampling. Finally, these modules are integrated to construct the generator network for image dehazing. Moreover, an improved discriminator network is proposed by adding a noise module to the conventional discriminator, enhancing the network's robustness. A novel loss function is introduced by incorporating a color loss and an SSIM loss into the traditional losses, aiming to improve color accuracy and reduce visual distortion. Compared with current leading methods, the proposed approach attains the highest PSNR and SSIM scores. The resulting dehazing technique preserves color fidelity and detail, yielding significantly clearer remote sensing images.
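The combined objective described above can be sketched as follows. This is a minimal, framework-free illustration of what "traditional loss plus SSIM loss plus color loss" might look like: the L1 term stands in for the traditional reconstruction loss, the SSIM term uses a simplified single-window (global) SSIM rather than the windowed variant, the color term measures per-pixel angular deviation between RGB vectors, and the weights `w_l1`, `w_ssim`, `w_color` are illustrative placeholders, not values from the paper.

```python
import numpy as np

def l1_loss(pred, target):
    """Traditional pixel-wise L1 reconstruction loss."""
    return np.mean(np.abs(pred - target))

def ssim_loss(pred, target, c1=0.01**2, c2=0.03**2):
    """1 - SSIM, using a simplified global (single-window) SSIM
    over images with values in [0, 1]."""
    mu_p, mu_t = pred.mean(), target.mean()
    var_p, var_t = pred.var(), target.var()
    cov = ((pred - mu_p) * (target - mu_t)).mean()
    ssim = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / \
           ((mu_p**2 + mu_t**2 + c1) * (var_p + var_t + c2))
    return 1.0 - ssim

def color_loss(pred, target, eps=1e-8):
    """One common form of color loss: mean angular deviation between
    predicted and reference RGB vectors, via cosine similarity."""
    dot = np.sum(pred * target, axis=-1)
    norm = np.linalg.norm(pred, axis=-1) * np.linalg.norm(target, axis=-1)
    return np.mean(1.0 - dot / (norm + eps))

def total_loss(pred, target, w_l1=1.0, w_ssim=0.5, w_color=0.1):
    """Weighted sum of the three terms; weights are hypothetical."""
    return (w_l1 * l1_loss(pred, target)
            + w_ssim * ssim_loss(pred, target)
            + w_color * color_loss(pred, target))
```

For identical images all three terms vanish, so the total loss is zero; any pixel-wise, structural, or chromatic deviation pushes it above zero, which is the behavior the combined objective is meant to enforce during generator training.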