Abstract

Accuracy and robustness are among the main concerns in vehicle positioning systems and autonomous applications. These concerns are especially critical in GNSS-denied environments, where an alternative technology is needed. In recent years, vision-based localization, known as visual odometry, has gained considerable attention among researchers. Visual odometry is a vision-based pose estimation technique developed to localize mobile platforms such as robots and vehicles while they perceive their environment. Over the last decade, researchers have worked intensively on techniques to achieve highly accurate and precise localization based on visual odometry, and their performance is commonly evaluated against an online benchmark dataset. Based on this benchmarking, this study reviews and compares the robustness of recent visual odometry techniques, particularly for vehicle localization under various road conditions. Evaluation methods for the selected techniques are presented, and a thorough analysis of each driving sequence is conducted. The analysis shows that, for all visual odometry techniques, localization during high-speed driving suffers higher translation error even though the surroundings contain less image noise. Despite that, the visual odometry technique built on careful feature Selection and Tracking (SOFT) proves more robust than the other techniques, with a relative translation error of 0.7% and a relative rotation error of 0.2 deg/hm.
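
The relative translation error (in %) and relative rotation error (in deg/hm) quoted above follow the segment-based evaluation used by the online benchmark. The snippet below is a minimal sketch of that metric, not the authors' evaluation code: it assumes poses are given as 4x4 camera-to-world homogeneous matrices, and the function names and segment lengths are illustrative.

```python
# Sketch of a KITTI-style relative error metric: translation error as a
# percentage of segment length, rotation error in degrees per 100 m (deg/hm).
import numpy as np

SEGMENT_LENGTHS = (100.0, 200.0, 300.0, 400.0)  # metres (illustrative subset)


def path_distances(poses):
    """Cumulative distance travelled along the ground-truth trajectory."""
    steps = [0.0]
    for prev, cur in zip(poses[:-1], poses[1:]):
        steps.append(steps[-1] + np.linalg.norm(cur[:3, 3] - prev[:3, 3]))
    return np.asarray(steps)


def rotation_angle_deg(R):
    """Angle (degrees) of a 3x3 rotation matrix, clamped for numerical safety."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(c))


def relative_errors(gt_poses, est_poses):
    """Average (translation %, rotation deg/hm) over all evaluated segments."""
    dist = path_distances(gt_poses)
    t_errs, r_errs = [], []
    for length in SEGMENT_LENGTHS:
        for i in range(len(gt_poses)):
            # First frame at least `length` metres further along the path.
            j = int(np.searchsorted(dist, dist[i] + length))
            if j >= len(gt_poses):
                break
            gt_rel = np.linalg.inv(gt_poses[i]) @ gt_poses[j]
            est_rel = np.linalg.inv(est_poses[i]) @ est_poses[j]
            err = np.linalg.inv(gt_rel) @ est_rel
            t_errs.append(np.linalg.norm(err[:3, 3]) / length * 100.0)      # %
            r_errs.append(rotation_angle_deg(err[:3, :3]) / length * 100.0) # deg/hm
    if not t_errs:
        return float("nan"), float("nan")
    return float(np.mean(t_errs)), float(np.mean(r_errs))
```

Under this metric, a 0.7% translation error means the estimated pose drifts by roughly 0.7 m over every 100 m driven, and 0.2 deg/hm means about 0.2 degrees of accumulated rotation error per 100 m.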
