Abstract
Due to the unavailability of large-scale underwater depth image datasets and the ill-posed nature of the problem, underwater single-image depth prediction is a challenging task. Unambiguous depth prediction from a single underwater image is an essential component of applications such as underwater robotics and marine engineering. This article presents an end-to-end underwater generative adversarial network (UW-GAN) for depth estimation from a single underwater image. First, a coarse-level depth map is estimated using the underwater coarse-level generative network (UWC-Net). Then, a fine-level depth map is computed using the underwater fine-level network (UWF-Net), which takes as input the concatenation of the estimated coarse-level depth map and the input image. The proposed UWF-Net incorporates spatial and channel-wise squeeze-and-excitation blocks for fine-level depth estimation. In addition, we propose a synthetic underwater image generation approach for building a large-scale database. The proposed network is evaluated on both real-world and synthetic underwater datasets. We also provide a comprehensive evaluation of the proposed UW-GAN on underwater images with different color dominations, contrast levels, and lighting conditions. The presented UW-GAN framework is further investigated for underwater single-image enhancement. Extensive experimental analysis demonstrates the superiority of the proposed UW-GAN over state-of-the-art (SoTA) hand-crafted and learning-based approaches for underwater single-image depth estimation (USIDE) and enhancement.
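The abstract describes a two-stage coarse-to-fine pipeline: a coarse generator (UWC-Net) predicts an initial depth map, and a fine generator (UWF-Net) refines it from the concatenation of the image and the coarse estimate, using spatial and channel-wise squeeze-and-excitation (scSE) blocks. The paper's architecture details are not given here, so the following is only a minimal PyTorch sketch of that pipeline; the layer counts, channel widths, and class names (UWCNet, UWFNet, SCSEBlock) are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a coarse-to-fine underwater depth pipeline with scSE blocks.
# All module names and layer configurations below are assumptions for illustration.
import torch
import torch.nn as nn


class SCSEBlock(nn.Module):
    """Concurrent spatial and channel-wise squeeze-and-excitation."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Channel squeeze-and-excitation branch (cSE): global pooling + bottleneck.
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        # Spatial squeeze-and-excitation branch (sSE): 1x1 conv attention map.
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.cse(x) + x * self.sse(x)


class UWCNet(nn.Module):
    """Coarse-level generator: RGB image -> coarse depth map (assumed layout)."""

    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, rgb):
        return self.body(rgb)


class UWFNet(nn.Module):
    """Fine-level generator: (RGB image, coarse depth) -> refined depth map."""

    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(4, 64, 3, padding=1), nn.ReLU(inplace=True),
            SCSEBlock(64),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            SCSEBlock(64),
            nn.Conv2d(64, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, rgb, coarse_depth):
        # Concatenate the input image with the coarse estimate along the channel axis.
        return self.body(torch.cat([rgb, coarse_depth], dim=1))


if __name__ == "__main__":
    rgb = torch.rand(1, 3, 256, 256)   # one underwater RGB image
    coarse = UWCNet()(rgb)             # stage 1: coarse depth map
    fine = UWFNet()(rgb, coarse)       # stage 2: refined depth map
    print(coarse.shape, fine.shape)    # both (1, 1, 256, 256)
```

In the full UW-GAN framework these generators would be trained adversarially against discriminators, which are omitted from this sketch.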