Abstract

Transparency reconstruction is a challenging problem in active 3D reconstruction, because transparent surfaces appear as invalid or erroneous depth in the data captured by structured-light sensors. This paper proposes a novel method to localize and reconstruct transparency in domestic environments with real-time camera tracking. Based on the Signed Distance Function (SDF), we estimate the camera pose by minimizing the residual error of multiple depth images in the voxel grid. We adopt asymmetric voting of invalid depth to carve out the transparency in the 3D domain. To handle the erroneous depth caused by transparency, we build a local model that tracks the depth oscillation of each voxel between frames. By fusing the depth data, we obtain the point cloud of the transparent objects and simultaneously achieve a higher-quality reconstruction of the indoor scene. We conduct a series of experiments using a hand-held sensor. The results validate that our approach accurately localizes transparent objects, improves their 3D models, and is robust against the interference of camera jitter and other noise.
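The asymmetric voting idea can be illustrated with a minimal sketch: voxels behind pixels that repeatedly report invalid depth accumulate transparency votes faster than valid observations erase them. The vote weights, threshold, and function names below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

# Assumed asymmetric weights: invalid-depth hits add votes faster than
# valid-depth observations subtract them, so persistently invalid regions
# (candidate transparent surfaces) accumulate high scores.
INVALID_VOTE = 1.0   # added when a ray through the voxel had invalid depth
VALID_VOTE = 0.25    # subtracted when the voxel was seen with valid depth

def update_votes(votes, invalid_mask, observed_mask):
    """Accumulate asymmetric transparency votes for one frame.

    votes: float array over voxels; invalid_mask marks voxels behind
    invalid pixels; observed_mask marks voxels confirmed by valid depth.
    """
    votes = votes + INVALID_VOTE * invalid_mask
    votes = votes - VALID_VOTE * observed_mask
    return np.clip(votes, 0.0, None)  # votes never go negative

def transparent_voxels(votes, threshold=3.0):
    # Voxels whose accumulated votes exceed the threshold are flagged
    # as belonging to a transparent surface.
    return votes >= threshold

# Toy 1-D "grid" of 5 voxels: voxel 2 keeps returning invalid depth
# across 4 frames, while the others are validly observed each frame.
votes = np.zeros(5)
for _ in range(4):
    votes = update_votes(votes,
                         invalid_mask=np.array([0, 0, 1, 0, 0], float),
                         observed_mask=np.array([1, 1, 0, 1, 1], float))
print(transparent_voxels(votes))  # only voxel 2 is flagged
```

The asymmetry matters because a transparent surface occasionally yields spurious valid depth; a small negative weight keeps such frames from erasing the accumulated evidence.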
