Abstract
Underwater images inevitably suffer from degradation and blur due to the scattering and absorption of light as it propagates through water, which hinders the development of underwater visual perception. Existing deep underwater image enhancement methods mainly rely on strong supervision from a large-scale dataset of aligned raw/enhanced underwater image pairs for model training. However, aligned image pairs are not available in most underwater scenes. This work addresses this problem by proposing a novel weakly supervised underwater image enhancement method (named WSUIE). First, a novel generative adversarial network (GAN)-based architecture is designed to enhance underwater images via unpaired image-to-image translation from domain X (raw underwater images) to domain Y (arbitrary high-quality images), which removes the need for aligned underwater image pairs. Second, a new objective function is formulated that exploits the intrinsic depth information of underwater images to increase the depth sensitivity of the method. In addition, a dataset with unaligned image pairs (named UUIE) is provided for model training. Extensive qualitative and quantitative evaluations of the WSUIE method on this dataset show that it improves visual perception performance while enhancing the visual quality of underwater images.
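The abstract names two components without giving their form: an adversarial objective over unpaired domains and a depth-sensitive term. As a minimal sketch only — the paper's actual losses, weighting scheme, and hyperparameters are not stated here, so the least-squares GAN loss, the depth-weighted L1 term, and the weight `lam` below are all illustrative assumptions — such a combined objective could look like:

```python
import numpy as np

def lsgan_generator_loss(d_fake):
    """Least-squares adversarial loss for the generator: pushes the
    discriminator's scores on enhanced images toward the 'real' label 1.
    (Assumed loss form; the paper's choice is not specified in the abstract.)"""
    return np.mean((d_fake - 1.0) ** 2)

def depth_weighted_l1(raw, enhanced, depth):
    """Hypothetical depth-sensitive content term: per-pixel differences are
    weighted by normalized scene depth, so distant (more degraded) regions
    contribute more to the objective."""
    w = depth / (depth.max() + 1e-8)  # normalize depth map to [0, 1]
    return np.mean(w * np.abs(enhanced - raw))

def total_objective(d_fake, raw, enhanced, depth, lam=0.5):
    # Combined objective: adversarial term plus depth-weighted content term;
    # lam is an illustrative trade-off weight, not a value from the paper.
    return lsgan_generator_loss(d_fake) + lam * depth_weighted_l1(raw, enhanced, depth)
```

The depth weighting encodes the physical intuition stated in the abstract: scattering and absorption grow with the light path length, so pixels imaging distant scene points are degraded more and deserve larger influence on the loss.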