Abstract

The problem of image rain removal has drawn widespread attention, as the blurry images caused by rain streaks can degrade the performance of many computer vision tasks. Although existing deep learning-based methods outperform most traditional methods, there are still unresolved performance issues. In this paper, we propose a novel enhanced attentive generative adversarial network, named EAGAN, to effectively remove rain streaks and restore the structural details of the image at the same time. As rain streaks vary in size and shape, EAGAN utilizes a multiscale aggregation attention module (MAAM) to produce an attention map that guides the subsequent network to focus on rain regions. A symmetrical autoencoder with long-range skip connections, squeeze-and-excitation (SE) modules, and a non-local operation is further employed to enhance the representational power of the network. Finally, spectral normalization and a relativistic generative adversarial network (GAN) are applied to improve training stability and deraining performance. Both qualitative and quantitative evaluations on synthetic and real-world datasets demonstrate that the proposed approach achieves competitive performance in comparison with state-of-the-art methods.
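As a quick reference for two of the building blocks named above, the following PyTorch sketch shows a standard squeeze-and-excitation (SE) channel-attention block and a spectrally normalized convolution. It is a minimal illustration of these generic components under our own assumptions (the class name, channel width, and reduction ratio are illustrative), not the authors' exact EAGAN layers.

```python
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

class SEBlock(nn.Module):
    """Squeeze-and-excitation: re-weights feature channels by globally pooled statistics."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)           # squeeze: global spatial average per channel
        self.fc = nn.Sequential(                      # excitation: per-channel gates in (0, 1)
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                  # channel-wise re-scaling

# Spectral normalization constrains a layer's largest singular value, which is the
# usual mechanism by which it stabilizes GAN training (here applied to a toy conv).
sn_conv = spectral_norm(nn.Conv2d(64, 64, kernel_size=3, padding=1))

x = torch.randn(1, 64, 32, 32)
y = SEBlock(64)(sn_conv(x))                           # toy forward pass showing how the pieces compose
```

In EAGAN these ideas sit inside a larger attention-guided autoencoder, but the channel-gating and normalization mechanics follow the same pattern as in this sketch.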

Highlights

  • As a common weather condition, rain produces streaks that impair visual perception in multimedia applications and significantly degrade the performance of many computer vision algorithms that rely on high-quality and reliable images, such as object tracking [1], semantic segmentation [2], and outdoor surveillance [3]

  • Existing techniques can be roughly divided into two categories according to their nature, i.e., traditional optimization-based methods and deep learning-based methods

  • To avoid the risk of incorrect luminance information and improve the deraining performance, we propose an enhanced attentive generative adversarial network (EAGAN), which can remove rain streaks and restore the realistic scene from the original image rather than from decomposed frequency components


Summary

INTRODUCTION

As a common weather condition, rain produces streaks that impair visual perception in multimedia applications and significantly degrade the performance of many computer vision algorithms that rely on high-quality and reliable images, such as object tracking [1], semantic segmentation [2], and outdoor surveillance [3]. Several traditional optimization-based algorithms, such as the nonlocal means filter [10], sparse coding [11], low-rank approximation [12], representation learning [13], and pre-trained Gaussian mixture models (GMM) [14], have been proposed to deal with the problem. These traditional methods attempt to exploit prior information on texture characteristics to model the rain streaks and separate them from the background [11]–[14], or to directly restore the image with nonlocal means filtering [10]. To avoid the risk of incorrect luminance information and improve the deraining performance, we propose an enhanced attentive generative adversarial network (EAGAN), which can remove rain streaks and restore the realistic scene from the original image rather than from decomposed frequency components.
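Most single-image deraining work, including the prior-based methods cited above, starts from an additive observation model in which the rainy image is the sum of a clean background and a rain-streak layer; an attention map such as the one produced by MAAM can then be read as a soft mask marking where that layer is active. The NumPy snippet below is a toy illustration of this common formulation; the variable names, the synthetic data, and the idealized restoration step are our own assumptions, not taken from the paper.

```python
import numpy as np

# Additive observation model used throughout the single-image deraining literature:
#   O = B + R, where O is the rainy image, B the clean background, and R the streak layer.
rng = np.random.default_rng(0)
B = rng.uniform(0.0, 1.0, size=(64, 64, 3))                       # clean background (toy data)
R = np.clip(rng.normal(0.0, 0.05, size=(64, 64, 3)), 0.0, None)   # sparse, non-negative rain streaks
O = B + R                                                         # observed rainy image

# A rain-attention map A in [0, 1] (the role played by a module such as MAAM)
# marks where the streak layer is active so the network can focus on those regions.
A = (R.mean(axis=-1, keepdims=True) > 0).astype(np.float32)       # idealized binary attention

# Idealized attention-guided restoration: assume a perfect streak estimate R_hat and
# correct only the pixels that the attention map flags as rainy.
R_hat = R
B_hat = np.where(A > 0, O - R_hat, O)
print("max abs reconstruction error:", np.abs(B_hat - B).max())   # exactly 0 in this toy setup
```

In practice the streak layer and the attention map both have to be estimated from the rainy input alone, which is what the learned generator in EAGAN is trained to do.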

RELATED WORK
EXPERIMENTS
Findings
CONCLUSION