Abstract

This paper presents a state-of-the-art light detection and ranging (LiDAR)-based autonomous navigation system for under-canopy agricultural robots. Under-canopy agricultural navigation is challenging because global navigation satellite system (GNSS) receivers and other positioning sensors lose accuracy due to attenuation and multi-path errors caused by crop leaves and stems. Reactive navigation, which detects crop rows from LiDAR measurements, has proved an efficient alternative to GNSS, yet it faces its own challenge: occlusion of the rows by leaves under the canopy. Our system addresses these issues by fusing inertial measurement unit (IMU) and LiDAR measurements in a Bayesian framework on low-cost hardware. In addition, a local goal generator (LGG) provides a local reference trajectory to the onboard controller. Our system is validated extensively in real-world field environments over a distance of 50.88 km, on multiple robots, in different field conditions, and across different locations. We report leading distance-between-interventions results for LiDAR+IMU-based under-canopy navigation, showing that our system safely navigates without intervention for 386.9 m on average in fields without significant gaps in the crop rows.
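
To make the fusion idea concrete, the sketch below frames it as a one-dimensional Kalman filter over the robot's heading relative to the crop row: the IMU gyro drives the prediction step, a least-squares line fit to LiDAR row returns drives the correction step, and a simplified stand-in for the local goal generator places a goal on the row centerline. Everything here (the `HeadingFuser` class, the noise parameters `q` and `r`, `fit_row_heading`, and `local_goal`) is an illustrative assumption, not the paper's actual implementation.

```python
# Minimal sketch of Bayesian LiDAR+IMU fusion for crop-row following.
# All models, names, and noise values are assumptions for illustration.
import numpy as np

class HeadingFuser:
    """1-D Kalman filter over heading relative to the crop row (illustrative)."""

    def __init__(self, q=1e-4, r=1e-2):
        self.theta = 0.0  # heading relative to the row axis (rad)
        self.var = 1.0    # estimate variance
        self.q = q        # process noise: gyro integration drift (assumed)
        self.r = r        # measurement noise: LiDAR line-fit error (assumed)

    def predict(self, yaw_rate, dt):
        # IMU step: propagate heading with the gyro; uncertainty grows.
        self.theta += yaw_rate * dt
        self.var += self.q * dt

    def update(self, lidar_heading):
        # LiDAR step: Bayesian correction toward the row-fit heading.
        k = self.var / (self.var + self.r)  # Kalman gain
        self.theta += k * (lidar_heading - self.theta)
        self.var *= 1.0 - k
        return self.theta

def fit_row_heading(points):
    # Least-squares line fit to LiDAR returns from one crop row
    # (N x 2 array in the robot frame); returns the row direction angle.
    slope = np.polyfit(points[:, 0], points[:, 1], 1)[0]
    return float(np.arctan(slope))

def local_goal(theta, lateral_offset, lookahead=2.0):
    # Simplified stand-in for the LGG: place a goal on the row centerline
    # a fixed lookahead ahead, expressed in the robot frame.
    gx = lookahead * np.cos(theta) - lateral_offset * np.sin(theta)
    gy = -lookahead * np.sin(theta) - lateral_offset * np.cos(theta)
    return np.array([gx, gy])

if __name__ == "__main__":
    fuser = HeadingFuser()
    fuser.predict(yaw_rate=0.05, dt=0.1)        # gyro sample at 10 Hz
    row = np.array([[0.5, 0.30], [1.0, 0.32], [1.5, 0.35]])
    theta = fuser.update(fit_row_heading(row))  # correct with LiDAR row fit
    print(theta, local_goal(theta, lateral_offset=0.1))
```

A real system would presumably track lateral offset jointly with heading and handle occluded or partial scans, but the predict/update structure above captures the Bayesian fusion pattern the abstract describes.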
