Abstract
Depth maps acquired by physical sensors or learned estimators often suffer from severe boundary distortions, including missing, fake, and misaligned boundaries relative to the accompanying RGB images. This paper proposes an RGB-guided depth map recovery method that restores true boundaries in such seriously distorted depth maps. To this end, a unified model is first developed to characterize all of these types of distorted boundaries. Because depth boundaries are essentially formed by contiguous regions of different intensities, detecting distorted boundaries is equivalent to identifying erroneous regions in the distorted depth map. Erroneous regions are therefore identified by separately extracting the local structures of the RGB image and the depth map with Gaussian kernels and comparing their similarity using the SSIM index. Building on the unified model, the proposed recovery method restores true depth boundaries by iteratively identifying and correcting erroneous regions in the recovered depth map with a weighted median filter. Because RGB images generally contain textural content absent from depth maps, the texture-copy artifact problem is further alleviated by restricting the model to regions around depth boundaries in each iteration. Extensive experiments covering depth map recovery, depth super-resolution, depth estimation enhancement, and depth completion enhancement are conducted on five RGB-depth datasets. The results demonstrate that the proposed method considerably improves both the quantitative and visual quality of recovered depth maps compared with fifteen competitive methods: most object boundaries are corrected accurately and kept sharp and well aligned with those in the RGB images.
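To make the described pipeline concrete, the Python sketch below gives one plausible reading of the iterative loop (not the authors' implementation): Gaussian-weighted SSIM between a grayscale guide image and the current depth estimate flags low-similarity pixels near depth boundaries as erroneous, and a guide-weighted median filter replaces them. The threshold tau, the boundary test, the neighborhood radius, and the weight bandwidth sigma_i are illustrative assumptions, not values from the paper.

import numpy as np
from skimage.metrics import structural_similarity

def recover_depth(depth, gray_guide, iters=5, tau=0.6, radius=3, sigma_i=0.1):
    """depth, gray_guide: float arrays in [0, 1] with the same shape."""
    d = depth.copy()
    H, W = d.shape
    for _ in range(iters):
        # Per-pixel SSIM map computed over Gaussian-weighted local windows,
        # comparing the local structures of the guide and the depth map.
        _, ssim_map = structural_similarity(
            gray_guide, d, gaussian_weights=True, sigma=1.5,
            data_range=1.0, full=True)
        # Low structural similarity marks candidate erroneous regions;
        # restricting them to depth boundaries limits texture copying.
        gy, gx = np.gradient(d)
        boundary = np.hypot(gx, gy) > 0.01   # assumed edge test, not the paper's
        wrong = (ssim_map < tau) & boundary
        # Correct each flagged pixel with a weighted median of its depth
        # neighborhood; weights favor neighbors with a similar guide value.
        d_new = d.copy()
        for y, x in zip(*np.nonzero(wrong)):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            patch = d[y0:y1, x0:x1].ravel()
            gdiff = gray_guide[y0:y1, x0:x1].ravel() - gray_guide[y, x]
            w = np.exp(-(gdiff ** 2) / (2 * sigma_i ** 2))
            order = np.argsort(patch)                # weighted median: first
            csum = np.cumsum(w[order])               # sorted value whose
            k = np.searchsorted(csum, csum[-1] / 2)  # cumulative weight
            d_new[y, x] = patch[order[k]]            # reaches half the total
        d = d_new
    return d

With a very large sigma_i the filter degenerates to a plain median over the window; the guide weighting is what pulls corrected boundary pixels toward the side of the edge that matches the RGB image.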