Abstract

Low-cost global navigation satellite systems combined with an inertial navigation system (GNSS/INS), the configuration most frequently used for vehicle localization, show errors of up to approximately 10 m even in open-sky environments such as highways. To reduce this error on highways, this paper proposes a localization method based on lane endpoints. Since a lane endpoint appears frequently on a road and can be detected at close range even by a low-cost monocular camera, it is a very useful landmark for precise localization. However, the lane width is generally less than 3.5 m, while the localization error from the GNSS is about 10 m. Therefore, if the ego-lane is not identified, the lane endpoints detected in the ego-lane can be falsely matched to the lane endpoints of another lane in the map. This paper proposes an in-lane localization method that uses lane endpoints, the relation between the camera and the road, and the vehicle's orientation estimated from a map. In addition, this paper proposes an ego-lane identification method that generates a hypothesis about the ego vehicle's position for each lane by using the proposed in-lane localization method and verifies each hypothesis through the projection of lane endpoints and an additional landmark such as a road sign. The average error of the proposed in-lane localization method is 0.248 m on highways. The success rate of the proposed ego-lane identification method is 99.28% in a single trial and reaches 100% when the results from multiple images are fused.
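
As a rough illustration of the hypothesize-and-verify idea described above, the sketch below generates one camera-pose hypothesis per candidate lane, projects the mapped lane endpoints into the image for each hypothesis, and keeps the lane with the smallest reprojection error. The data layout, the pinhole camera model, and the one-to-one endpoint correspondence are simplifying assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def project_points(K, R, t, points_world):
    """Project 3D world points into the image with a pinhole model (assumption)."""
    pts_cam = (R @ points_world.T + t.reshape(3, 1)).T   # world -> camera frame
    pts_img = (K @ pts_cam.T).T                           # camera -> homogeneous pixels
    return pts_img[:, :2] / pts_img[:, 2:3]               # normalize by depth

def identify_ego_lane(lane_hypotheses, detected_endpoints_px, K):
    """Return the lane whose pose hypothesis best explains the detections.

    lane_hypotheses: list of (lane_id, R, t, map_endpoints_world) tuples,
                     one camera-pose hypothesis per candidate lane.
    detected_endpoints_px: (N, 2) array of lane endpoints detected in the image,
                           assumed to correspond one-to-one with the map endpoints.
    """
    best_lane, best_error = None, np.inf
    for lane_id, R, t, map_endpoints in lane_hypotheses:
        projected = project_points(K, R, t, map_endpoints)
        # Mean reprojection error between projected map endpoints and detections.
        error = np.mean(np.linalg.norm(projected - detected_endpoints_px, axis=1))
        if error < best_error:
            best_lane, best_error = lane_id, error
    return best_lane, best_error
```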

Highlights

  • Vehicle localization is used to estimate the current global position of a vehicle and is one of the core components in autonomous driving [1]. The most widely adopted localization systems are global navigation satellite systems (GNSSs), which estimate their position through multilateration with satellite signals [2]. The precision of a GNSS can be degraded by the diffused reflection of signals on skyscrapers, signal blocking in tunnels, or atmospheric signal distortion.

  • The global positions of lane endpoints and road sign vertices are projected into the camera at the ground-truth position, as shown in Figure 9(b). The projected positions almost match the true positions, which proves that the camera global position calculated from the mobile mapping system (MMS) is very precise.

  • The proposed ego-lane identification method perfectly identified the ego-lane by fusing the identification results from four images, even when there were false detections of lane endpoints or a road sign (a minimal fusion sketch follows this list).

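The last highlight mentions fusing identification results from several images. A minimal way to realize such a fusion, assuming each image yields one candidate lane with a confidence score, is a weighted vote across frames; the scoring scheme here is an illustrative assumption, not the paper's method.

```python
from collections import defaultdict

def fuse_lane_votes(per_image_results):
    """Fuse per-image ego-lane candidates by weighted voting (illustrative).

    per_image_results: iterable of (lane_id, confidence) pairs, one per image,
                       e.g. confidence = 1 / (1 + reprojection_error).
    """
    votes = defaultdict(float)
    for lane_id, confidence in per_image_results:
        votes[lane_id] += confidence
    return max(votes, key=votes.get)

# Example: three images agree on lane 2; one false detection votes for lane 1.
print(fuse_lane_votes([(2, 0.9), (2, 0.8), (1, 0.3), (2, 0.7)]))  # -> 2
```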

Summary

Introduction

Vehicle localization is used to estimate the current global position of a vehicle and is one of the core components in autonomous driving [1]. The most widely adopted localization systems are global navigation satellite systems (GNSSs), which estimate their position through multilateration with satellite signals [2]. The precision of a GNSS can be degraded by the diffused reflection of signals on skyscrapers, signal blocking in tunnels, or atmospheric signal distortion. To reduce the localization error caused by diffused signal reflection or signal blocking, GNSS/INS systems that combine a GNSS with an inertial navigation system (INS) have been researched [4]. More expensive GNSS/INS systems with a very precise INS and RTK GPS can keep their localization errors to less than the width of a lane even in urban areas where diffused signal reflection or signal blocking often occurs [5]. This type of system is not affordable for mass-produced vehicles because of its high price, and even a precise INS cannot avoid cumulative errors. An autonomous driving vehicle requires a localization system whose precision is on the order of tens of centimeters rather than tens of meters [6].
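
For context on the multilateration step mentioned above, the sketch below estimates a receiver position from ranges to satellites with known positions using Gauss-Newton least squares. The synthetic satellite geometry is an assumption, and the receiver clock bias that a real GNSS solver must also estimate is omitted for brevity.

```python
import numpy as np

def multilaterate(sat_positions, ranges, x0, iterations=10):
    """Estimate a 3D position from ranges to known points via Gauss-Newton."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iterations):
        diffs = x - sat_positions                  # (N, 3) vectors from satellites
        dists = np.linalg.norm(diffs, axis=1)      # predicted ranges
        residuals = ranges - dists                 # measured minus predicted
        J = diffs / dists[:, None]                 # Jacobian of predicted ranges w.r.t. x
        dx, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x += dx
    return x

# Synthetic example: four satellites and noise-free ranges to a known position.
sats = np.array([[15e6, 0.0, 20e6], [-10e6, 12e6, 21e6],
                 [5e6, -14e6, 19e6], [0.0, 6e6, 22e6]])
true_pos = np.array([1000.0, 2000.0, 0.0])
ranges = np.linalg.norm(sats - true_pos, axis=1)
print(multilaterate(sats, ranges, x0=[0.0, 0.0, 0.0]))  # ~ [1000, 2000, 0]
```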


