Abstract

This letter presents a robust, real-time visual odometry system that efficiently exploits the visual and geometric cues available in RGB-D frames for both tracking and mapping. Together with a hybrid tracking algorithm based on a joint multi-objective formulation, we additionally incorporate a point-to-plane metric into the photometric bundle adjustment (PBA) to constrain the iteration direction, especially for weakly textured points. The relative pose constraints derived from the PBA-optimized poses are then leveraged, in combination with reprojection constraints retrieved from the maintained feature map, to refine keyframe poses and feature locations. Moreover, the slanted support plane commonly used in multi-view stereo matching is utilized to adjust the semi-dense points, further enhancing mapping accuracy, which in turn benefits the front-end tracking. We extensively evaluate our algorithm on benchmark datasets, and the experimental results validate the advantage of our method in overall tracking performance over other representative approaches.
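The abstract does not state the residual definitions; as a rough illustration only, the sketch below shows the standard point-to-plane residual that such a hybrid formulation typically adds alongside the photometric term. All function and variable names here are hypothetical, not taken from the paper, and the pose is assumed to be a 4x4 rigid transform.

```python
import numpy as np

def point_to_plane_residual(T, p_src, q_dst, n_dst):
    """Signed distance of the transformed source point from the
    tangent plane (point q_dst, unit normal n_dst) in the target frame.
    T is a 4x4 rigid-body transform; names are illustrative."""
    R, t = T[:3, :3], T[:3, 3]
    p_warped = R @ p_src + t          # warp source point into target frame
    return float(n_dst @ (p_warped - q_dst))

def hybrid_cost(T, p_src, q_dst, n_dst, r_photo, lam=0.5):
    """Toy joint objective: squared photometric residual plus a weighted
    squared geometric (point-to-plane) residual. The weight lam is an
    assumed trade-off parameter, not a value from the paper."""
    r_geo = point_to_plane_residual(T, p_src, q_dst, n_dst)
    return r_photo ** 2 + lam * r_geo ** 2
```

For a weakly textured point the photometric residual carries little gradient information, so the geometric term above is what constrains the update direction in the joint optimization.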
