Abstract

The accelerating pace of automation in agricultural tasks demands highly accurate and robust localization systems for field robots. Simultaneous Localization and Mapping (SLAM) methods inevitably accumulate drift on exploratory trajectories and rely primarily on place revisiting and loop closing to keep the global localization error bounded. Loop closure is particularly challenging in agricultural fields, as the local visual appearance of different views is very similar and can change easily due to weather effects. A suitable alternative in practice is to employ global sensor positioning systems jointly with the rest of the robot sensors. In this paper we propose and implement the fusion of global navigation satellite system (GNSS), stereo views, and inertial measurements for localization purposes. Specifically, we incorporate, in a tightly coupled manner, GNSS measurements into the stereo‐inertial ORB‐SLAM3 pipeline. We thoroughly evaluate our implementation on the sequences of the Rosario data set, recorded by an autonomous robot in soybean fields, and on our own in‐house data. Our data includes measurements from a conventional GNSS, rarely included in evaluations of state‐of‐the‐art approaches. We characterize the performance of GNSS‐stereo‐inertial SLAM in this application case, reporting pose error reductions between 10% and 30% compared to visual–inertial and loosely coupled GNSS‐stereo‐inertial baselines. In addition to this analysis, we also release the code of our implementation as open source.
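To make the idea of tight coupling concrete: in a tightly coupled formulation, GNSS fixes enter the SLAM optimization as residuals on the estimated state rather than being fused with an already-computed pose. The sketch below is a hypothetical illustration, not the paper's actual code; the function name, the lever-arm extrinsic `t_bg`, and the use of a Cholesky-whitened residual are all assumptions about how such a GNSS position factor is commonly formed.

```python
import numpy as np

def gnss_position_residual(R_wb, p_wb, t_bg, p_gnss, cov_gnss):
    """Whitened residual of a (hypothetical) GNSS position factor.

    R_wb, p_wb : estimated body rotation (3x3) and position (3,) in the world frame.
    t_bg       : GNSS antenna lever arm in the body frame (extrinsic, 3,).
    p_gnss     : measured antenna position in the world frame (3,).
    cov_gnss   : 3x3 measurement covariance reported by the receiver.
    """
    # Antenna position predicted from the current SLAM state.
    predicted = p_wb + R_wb @ t_bg
    # Raw position error between prediction and GNSS measurement.
    error = predicted - p_gnss
    # Whiten by the inverse measurement covariance so that noisy
    # (high-covariance) fixes contribute less to the optimization.
    L = np.linalg.cholesky(np.linalg.inv(cov_gnss))
    return L.T @ error
```

A nonlinear least-squares backend would sum the squared norms of such residuals over all keyframes with GNSS fixes, alongside the visual reprojection and IMU preintegration terms.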
