Abstract

Vision-based underwater exploration is crucial for marine research. However, underwater images are degraded by light attenuation and scattering, which results in poor visual quality and impedes the development of vision-based underwater exploration systems. Popular recent learning-based Underwater Image Enhancement (UIE) methods address this challenge by training enhancement networks on annotated image pairs, where the label image is manually selected from the reference images produced by existing UIE methods, since no ground truth exists for underwater images. Nevertheless, these methods face uncertainty issues stemming from ambiguous multiple-candidate references. Moreover, they often suffer from limited local perception and color perception, which hinders effective mitigation of wide-ranging underwater degradation. This paper proposes NUAM-Net (Novel Underwater Image Enhancement Attention Mechanism Network), which addresses these limitations. NUAM-Net leverages a probabilistic training framework that measures enhancement uncertainty in order to learn the UIE mapping from a set of ambiguous reference images. By extracting features from both the RGB and LAB color spaces, our method fully exploits the fine-grained color-degradation clues in underwater images. Additionally, we strengthen underwater feature extraction with a novel Adaptive Underwater Image Enhancement Module (AUEM) that combines local and long-range receptive fields. Experimental results on the well-known UIEBD benchmark demonstrate that our method significantly outperforms popular UIE methods in PSNR while maintaining a favorable Mean Opinion Score. An ablation study further validates the effectiveness of the proposed method.
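The abstract motivates extracting features from the LAB color space because it exposes fine-grained color-degradation clues. The paper's implementation details are not given here; as a minimal illustration of why LAB is useful for this, the sketch below performs the standard textbook per-pixel sRGB-to-CIELAB conversion (D65 white point), in which luminance (L) is separated from the two chroma channels (a, b) that carry a water column's color cast. The function `srgb_to_lab` is a hypothetical helper, not the authors' code.

```python
# Illustrative sketch only (standard colorimetry, not the NUAM-Net code):
# converting sRGB to CIELAB separates brightness from the chroma channels
# in which an underwater color cast is directly visible.

def srgb_to_lab(r, g, b):
    """Convert one sRGB pixel (components in [0, 1]) to CIELAB (L, a, b)."""
    # 1. Undo the sRGB gamma to obtain linear-light RGB.
    def linearize(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)

    # 2. Linear RGB -> CIE XYZ (sRGB primaries, D65 white point).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl

    # 3. XYZ -> LAB, normalized by the D65 reference white.
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

# A bluish-green underwater cast shows up as negative a (toward green) and
# negative b (toward blue), independently of overall brightness:
L, a, b = srgb_to_lab(0.2, 0.6, 0.7)
```

Feeding a network both representations, as the abstract describes, lets it see the cast explicitly in (a, b) while still having the raw RGB signal.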

