Abstract

In this study, a vision-based furrow line detection method was developed for navigating an autonomous robot vehicle in an agricultural field. The furrow line detection method integrates a crop/non-crop field identification method, two types of box filters (a color-based furrow detection filter and a grayscale separability-based furrow detection filter), and a robust furrow line parameter estimator. In experiments, the performance of the developed method was tested on more than 8000 images of 17 types of test fields: nine types of crop fields (sweet pea, green pea, snow pea, lettuce, Chinese cabbage, cabbage, green pepper, tomato, and tea) and eight types of tilled soil fields. Using a wide camera angle with a low depression angle, the detection rate of the furrow line was 98.0%, the root mean square error (RMSE) of the furrow line position was 12.1 pixels, and the RMSE of the furrow line angle was 3.8°. Using an oblique camera angle, the detection rate was 93.4%, the RMSE of position was 23.3 pixels, and the RMSE of angle was 6.1°. These results show that the method could detect the furrow line in the test fields with acceptable accuracy at both camera angles. The average processing speed was approximately 2.5 Hz for the crop fields and 4.0 Hz for the tilled soil fields. The method demonstrated high potential to robustly and precisely detect a single targeted furrow line across the 17 types of test fields.
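The abstract does not specify the robust furrow line parameter estimator; a common choice for fitting a line to noisy candidate furrow pixels in the presence of outliers is RANSAC. The sketch below is purely illustrative (the function name, parameters, and synthetic points are assumptions, not the paper's implementation):

```python
import math
import random

def ransac_line(points, iterations=200, inlier_tol=3.0, seed=0):
    """Illustrative robust line estimator (RANSAC), NOT the paper's
    exact method: repeatedly fit a line through two random candidate
    furrow points and keep the model with the most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        dx, dy = x2 - x1, y2 - y1
        norm = math.hypot(dx, dy)
        if norm == 0:
            continue  # degenerate sample: both points coincide
        # Line in normal form nx*x + ny*y + c = 0; |nx*x + ny*y + c|
        # is then the perpendicular distance of (x, y) to the line.
        nx, ny = -dy / norm, dx / norm
        c = -(nx * x1 + ny * y1)
        inliers = [(x, y) for (x, y) in points
                   if abs(nx * x + ny * y + c) < inlier_tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (nx, ny, c), inliers
    return best_model, best_inliers

# Hypothetical candidate furrow points: 20 on the line y = 2x + 5,
# plus 3 gross outliers (e.g. misdetected crop pixels).
pts = [(x, 2 * x + 5) for x in range(0, 100, 5)]
pts += [(10, 90), (50, 0), (80, 300)]
model, inliers = ransac_line(pts, inlier_tol=2.0)
```

With this synthetic data, the estimator recovers all 20 collinear points as inliers while rejecting the 3 outliers, which is the behavior a furrow line estimator needs when the box filters occasionally fire on non-furrow texture.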
