Abstract

The fusion of shape from polarization (SFP) and depth sensors is an effective route to high-quality 3D reconstruction. However, accurately registering images from different sensors is a difficult problem. In this paper, we propose, for the first time, a 3D reconstruction method based on the fusion of SFP and polarization-modulated ranging (PMR), in which a single image sensor acquires both the polarized images and the depth data, so the challenging image registration problem is avoided entirely. The method rests on the following observation: SFP can retrieve objects' 3D shapes with fine textures but poor absolute accuracy, whereas PMR provides coarse but accurate absolute depths, making the fusion of these two modalities highly desirable. To this end, we propose two fusion models: a joint azimuth estimation model that yields a fused azimuth angle with the π-ambiguity corrected, and a joint zenith estimation model that estimates an accurate fused zenith angle. Finally, the reconstructed depth is integrated from the fused azimuth and zenith angles, yielding accurate absolute depth with fine textures. Extensive experiments verify the superiority of the proposed method over other state-of-the-art methods, as well as its robustness to targets with different kinds of features, demonstrating its broad application prospects in 3D reconstruction.
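The final integration step described above, recovering depth from per-pixel azimuth and zenith angles, can be illustrated with a minimal sketch. The paper's specific fusion models are not reproduced here; this only shows the standard geometry (surface normal from azimuth φ and zenith θ, converted to depth gradients) followed by Frankot–Chellappa integration, a common choice for this step. Function names and the choice of integrator are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normals_from_angles(azimuth, zenith):
    # Unit surface normal from azimuth (phi) and zenith (theta) angle maps.
    nx = np.sin(zenith) * np.cos(azimuth)
    ny = np.sin(zenith) * np.sin(azimuth)
    nz = np.cos(zenith)
    return nx, ny, nz

def integrate_depth(azimuth, zenith):
    """Integrate a depth map from angle maps via Frankot-Chellappa.

    Illustrative stand-in for the paper's integration step; the absolute
    offset (which PMR would supply) is left at zero here.
    """
    nx, ny, nz = normals_from_angles(azimuth, zenith)
    nz = np.clip(nz, 1e-6, None)       # avoid division by zero at grazing angles
    p, q = -nx / nz, -ny / nz          # depth gradients dz/dx, dz/dy
    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2 * np.pi
    wy = np.fft.fftfreq(h) * 2 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                  # avoid division by zero at the DC term
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                      # absolute depth offset left unresolved
    return np.real(np.fft.ifft2(Z))
```

Frankot–Chellappa enforces integrability in the Fourier domain, which is why a consistent (ambiguity-corrected) azimuth map matters: an uncorrected π-flip produces contradictory gradients that no integrator can reconcile.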
