Abstract

With the prosperity of artificial intelligence, more and more jobs will be performed by robots. The future of precision agriculture (PA) will rely on autonomous robots to perform various agricultural operations. Real-time kinematic (RTK) assisted global positioning systems (GPS) can provide very accurate localization, with a detection error of less than $\pm 2$ cm under ideal conditions. Autonomously driving a robotic vehicle within a furrow requires relative localization of the vehicle with respect to the furrow centerline. Acquiring this relative location requires both the coordinates of the vehicle and those of all the stalks of the crop rows on both sides of the furrow. Acquiring coordinates for every crop stalk demands an onerous geographical survey of the entire field in advance. Additionally, real-time RTK-GPS localization of moving vehicles may suffer from satellite occlusion, so the above-mentioned $\pm 2$ cm accuracy is often significantly compromised in practice. Against this background, we propose sets of computer vision algorithms that coordinate with a low-cost camera (50 US dollars) and a LiDAR sensor (1500 US dollars) to detect the relative location of the vehicle in the furrow during the early and late growth seasons, respectively. Our solution package is superior to most current computer vision algorithms used for PA, thanks to its improved features, such as a machine-learning-enabled dynamic crop recognition threshold that adaptively adjusts its value according to environmental changes such as ambient light and crop size. Our in-field tests show that the proposed algorithms approach the accuracy of an ideal RTK-GPS in cross-track detection and exceed an ideal RTK-GPS in heading detection. Moreover, our solution package relies neither on satellite communication nor on advance geographical surveys.
Therefore, our low-complexity, low-cost solution package is a promising localization strategy: it provides the same level of accuracy as an ideal RTK-GPS, yet more consistently and more reliably, since it requires neither the external conditions nor the survey work demanded by RTK-GPS.

Highlights

  • Precision agriculture (PA) has become a trending research topic due to the increasing demands for environmental protection, cost reduction and yield enhancement [1], [2]

  • The open-loop performance of the proposed computer vision algorithms is compared with real-time kinematic (RTK)-global positioning system (GPS) localization during both the early and late growth seasons in Figures 30 and 32, where RGB filtering and inverted linear regression are used jointly for early growth detection

  • This proves that the proposed computer vision algorithms, when used with a low-cost 2D camera during the early growth season or a LiDAR during the late growth season, provide the rover with the same route as the RTK-GPS, yet at a much lower realization complexity


Summary

INTRODUCTION

Precision agriculture (PA) has become a trending research topic due to the increasing demands for environmental protection, cost reduction and yield enhancement [1], [2]. When the crops grow taller than a certain height, the computer vision based algorithm may no longer be efficacious for detecting the crop rows, and a LiDAR is used instead of a camera. These new challenges posed by precision agriculture have inspired a large amount of research on crop row detection and autonomous rover localization. While the conference paper only presented results from preliminary tests done with tapes on the floor imitating crop rows, we conducted extensive field tests throughout the early and late growth stages of the crops, at different times of the day, with a large number of combinations of preset cross-track and heading values.
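To make the camera-based early-growth pipeline concrete, the following is a minimal sketch of the two ingredients named in the highlights: an RGB (excess-green) filter to segment crop pixels, and an "inverted" linear regression that fits image column as a function of row, which stays well-conditioned for near-vertical crop rows. The function names, the fixed threshold, and the ExG index choice are illustrative assumptions, not the paper's exact implementation; in particular, the paper's machine-learning-enabled threshold would replace the constant used here.

```python
import numpy as np

def excess_green_mask(rgb, threshold=20):
    """Binary crop mask from the excess-green index ExG = 2G - R - B.

    rgb: HxWx3 uint8 image. The fixed `threshold` is a placeholder; the
    paper's adaptive threshold would tune it to ambient light and crop size.
    """
    r = rgb[..., 0].astype(np.int32)
    g = rgb[..., 1].astype(np.int32)
    b = rgb[..., 2].astype(np.int32)
    exg = 2 * g - r - b
    return exg > threshold

def fit_row_centerline(mask):
    """Least-squares line x = a*y + b through the crop pixels.

    Regressing column (x) on row (y) -- the "inverted" regression --
    avoids the infinite-slope problem for vertical rows.
    """
    ys, xs = np.nonzero(mask)
    a, b = np.polyfit(ys, xs, 1)
    return a, b
```

As a quick check, a synthetic image with a vertical green stripe centered at column 50 yields a fitted line with near-zero slope and an intercept near 50.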

EARLY GROWTH SUBSYSTEM
Image Capture and Undistortion
Triangle Filtering and Centerline Calculation
Locating the Plants in the Image
Calculating the Furrow Lines
Adaptive Anti-Overinflation Adjustments
LATE GROWTH SUBSYSTEM
RESULTS
Late Growth Season
Open and Closed Loop Performance Comparisons
CONCLUSION