Abstract

Localization is fundamental for autonomous vehicle applications. Compared with the widely developed LiDAR-based vehicle localization, vision-based localization has attracted considerable attention in recent years owing to its low-cost sensors. One of the challenges for visual localization is outliers in the measurements, arising from appearance changes caused by illumination, season, and weather. To address this problem, we present a real-time robust visual localization approach that achieves deterministic optimality with global convergence. The idea is to decouple the rotation and translation estimation by exploiting the fact that the pitch and roll angles of the query pose are similar to those of the map reference pose, since the vehicle motion is locally planar. Based on this decoupled formulation, we first estimate the optimal yaw angle and eliminate the majority of outliers with an efficient inlier voting method, and then find the optimal translation by maximum clique search. The two subproblem estimators are embedded in a prioritized search paradigm to guarantee deterministic optimality. In the experiments, simulations demonstrate that the proposed method achieves superior robustness even at extreme outlier rates (95%). Results on both public and self-collected real-world vehicle datasets validate the effectiveness of the proposed method in real applications.
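To make the decoupled idea concrete, here is a minimal 2D toy sketch, not the authors' implementation: under locally planar motion, each pair of point correspondences votes for a yaw hypothesis; the histogram peak recovers the rotation and discards most outliers, and the remaining per-correspondence translation hypotheses are filtered by a simple consensus step (a stand-in for the paper's maximum clique search and prioritized search paradigm). All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def estimate_yaw_by_voting(query, ref, bins=360):
    """Vote for the yaw angle using pairs of 2D correspondences.

    If ref = R(yaw) @ query + t, then for any pair (i, j) the segment
    ref[i]-ref[j] is the segment query[i]-query[j] rotated by yaw, so
    each pair yields one yaw hypothesis; inlier pairs agree, outliers
    scatter. The histogram peak gives the yaw estimate.
    """
    n = len(query)
    votes = []
    for i in range(n):
        for j in range(i + 1, n):
            dq = query[i] - query[j]
            dr = ref[i] - ref[j]
            yaw = np.arctan2(dr[1], dr[0]) - np.arctan2(dq[1], dq[0])
            # wrap to (-pi, pi]
            votes.append((yaw + np.pi) % (2 * np.pi) - np.pi)
    hist, edges = np.histogram(votes, bins=bins, range=(-np.pi, np.pi))
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])

def estimate_translation(query, ref, yaw, tol=0.3):
    """Consensus translation given the recovered yaw.

    Each correspondence yields a translation hypothesis t = ref - R q;
    hypotheses close to the (robust) median are treated as mutually
    consistent inliers -- a cheap proxy for maximum clique search.
    """
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    t_hyp = ref - query @ R.T        # one hypothesis per correspondence
    med = np.median(t_hyp, axis=0)
    inliers = np.linalg.norm(t_hyp - med, axis=1) < tol
    return t_hyp[inliers].mean(axis=0), inliers
```

Even with half of the correspondences replaced by random outliers, the voting peak isolates the true yaw, after which the translation consensus is a 1-step robust average; this mirrors why decoupling makes each subproblem cheap enough for deterministic, globally convergent search.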

