Abstract

Conventional visual SLAM acquires scene images with monocular or binocular cameras and then processes the sensor data to reconstruct a map. However, these cameras are not sensitive enough to texture-poor scenes: image acquisition in such regions is prone to large deviations, and the corresponding surface shape cannot be recovered. To address these challenges, this paper proposes a SLAM method that uses a polarization camera to capture the polarization characteristics of the target and then computes depth information to achieve reconstruction. Specifically, during the SLAM process we use a monocular polarization camera to acquire an image sequence; the phase angle and normal vector obtained from polarization at each pixel are used to recover the depth of textured objects; multi-view normal-vector constraints are then fused to perform 3D reconstruction; and finally, during depth propagation, polarized light-field information is used to constrain the propagation into non-textured regions. We evaluated indoor and outdoor sequences of single- and multi-object scenes and compared the reconstruction results with existing algorithms such as DSO and ORB-SLAM. The comparison shows that our visual SLAM method reconstructs the surface shape of non-textured objects better than these common SLAM methods.
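The abstract does not give the exact formulation, so as a point of reference the sketch below shows the standard shape-from-polarization quantities such a pipeline typically relies on: the phase (azimuth) angle and degree of polarization are estimated from several polarizer-angle images via Stokes parameters, and a per-pixel surface normal is assembled from the resulting azimuth and zenith angles. Function and variable names (e.g. polarization_cues, zenith_from_dop) are illustrative assumptions, not taken from the paper.

import numpy as np

def polarization_cues(I0, I45, I90, I135):
    """Estimate per-pixel polarization cues from four polarizer-angle images.

    Standard Stokes-parameter estimation; a hypothetical sketch, not the
    authors' implementation.
    """
    s0 = 0.5 * (I0 + I45 + I90 + I135)                    # total intensity
    s1 = I0 - I90                                         # linear polarization (0/90 deg)
    s2 = I45 - I135                                       # linear polarization (45/135 deg)
    dop = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-8)   # degree of polarization
    phase = 0.5 * np.arctan2(s2, s1)                      # phase (azimuth) angle, pi-ambiguous
    return dop, phase

def normal_from_polarization(dop, phase, zenith_from_dop):
    """Assemble unit surface normals from azimuth (phase) and zenith angles.

    `zenith_from_dop` maps the degree of polarization to the zenith angle; its
    form depends on the reflection model (diffuse vs. specular) and refractive
    index, which the abstract does not specify.
    """
    theta = zenith_from_dop(dop)                          # zenith angle per pixel
    n = np.stack([np.cos(phase) * np.sin(theta),
                  np.sin(phase) * np.sin(theta),
                  np.cos(theta)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

In the full pipeline described in the abstract, such per-pixel normals would then be fused across views and used to constrain depth propagation in the texture-poor regions.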
