Abstract

Raw depth maps captured by depth sensors generally contain missing regions caused by glossy surfaces, transparent objects, and sensor sparsity. Recent methods complete the flat regions of raw depth maps well; however, they neglect the accuracy of depth structures. In this paper, an effective depth structure completion method is developed to infer missing depth structures. First, a raw depth map is divided into flat regions and depth structures by a structure prediction network. Second, two local features, surface normals and Gaussian weights, are extracted from a reference RGB image to impose constraints on the flat regions and the depth structures, respectively. Third, a kernel least-squares module is adopted to suppress texture-copy artifacts. Finally, an iterative optimization model is developed by embedding the two constraints into a Markov random field. The cost function of the model comprises three terms, which enforce data fidelity between the completed and raw depth maps, smoothness of flat regions, and accuracy of depth structures, respectively. The proposed method is evaluated on four indoor datasets, Matterport3D, RealSense, ScanNet, and NYUv2, and compared with eight recent baselines. Quantitative results show that the RMSE and MAE of completed depth maps are considerably reduced, by 22.0% and 45.3%, respectively. Visual results demonstrate superiority in completing depth structures and suppressing texture-copy artifacts. Generalization tests verify the effectiveness on unseen datasets.
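As a rough sketch only (the abstract does not give the exact formulation, so all symbols and weights below are assumptions for illustration), a three-term cost function of this kind, embedded in a Markov random field over the completed depth map D, might take the form:

\[
E(D) \;=\; \underbrace{\sum_{p \in \Omega_{\mathrm{obs}}} \big(D(p) - D_{\mathrm{raw}}(p)\big)^2}_{\text{data fidelity}}
\;+\; \lambda_{s} \underbrace{\sum_{p \in \Omega_{f}} \sum_{q \in \mathcal{N}(p)} g_{pq}\,\big(D(p) - D(q)\big)^2}_{\text{smoothness of flat regions}}
\;+\; \lambda_{t} \underbrace{\sum_{p \in \Omega_{d}} \big(\mathbf{n}_p^{\top}\, \nabla D(p)\big)^2}_{\text{accuracy of depth structures}}
\]

Here \(D_{\mathrm{raw}}\) is the raw depth map, \(\Omega_{\mathrm{obs}}\) the set of valid raw measurements, \(\Omega_{f}\) and \(\Omega_{d}\) the flat regions and depth structures predicted by the structure prediction network, \(g_{pq}\) the Gaussian weights derived from the reference RGB image, \(\mathbf{n}_p\) the surface normal at pixel \(p\), and \(\lambda_{s}, \lambda_{t}\) hypothetical balancing weights. Minimizing such an energy iteratively, with the two RGB-derived constraints applied separately to flat regions and structures, matches the optimization procedure the abstract describes.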
