Computer vision provides local environmental information for robotic navigation in crop fields. It is particularly useful for robots operating under the canopies of tall plants such as corn (Zea mays) and sorghum (Sorghum bicolor), where GPS signals are often unavailable. However, the development of under-canopy navigation systems remains an open research area. The key contribution of our work is a vision-based system for under-canopy navigation using a Time-of-Flight (ToF) camera. Within this system, a novel algorithm detects parallel crop rows in depth images captured under crop canopies. Two critical navigation tasks are accomplished based on the detection results: 1) generating crop field maps as occupancy grids when reliable robot localization is available (from other sources such as GPS and IMU), and 2) providing inter-row vehicle positioning data when a field map is available but localization is unreliable. The proposed system was evaluated in field tests. The results showed that the system mapped crop rows with mean absolute errors (MAE) of 3.4 cm and 3.6 cm in corn and sorghum fields, respectively, and provided lateral positioning data with MAEs of 5.0 cm and 4.2 cm in corn and sorghum crop rows, respectively. The potential and limitations of using ToF cameras for under-canopy navigation are discussed.
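The two navigation tasks named above can be illustrated with a minimal sketch. The function names, grid representation (log-odds occupancy), and the centerline-offset formula below are illustrative assumptions, not the paper's actual algorithm, which detects parallel rows in ToF depth images:

```python
import numpy as np

def lateral_offset(d_left, d_right):
    """Hypothetical inter-row positioning: signed offset (m) from the lane
    centerline, given perpendicular distances to the left and right crop
    rows. Positive means the robot is closer to the right row."""
    return (d_left - d_right) / 2.0

def update_grid(grid, row_cells, p_hit=0.7):
    """Hypothetical mapping step: log-odds occupancy update for grid cells
    where crop-row points were detected in a depth image."""
    l_hit = np.log(p_hit / (1.0 - p_hit))
    for r, c in row_cells:
        grid[r, c] += l_hit
    return grid

# Toy usage: two detected row points, robot 2 cm right of center.
grid = update_grid(np.zeros((10, 10)), [(2, 3), (2, 4)])
off = lateral_offset(0.40, 0.36)
```

In practice the detected rows would come from plane/line fits in the depth image, and the grid would be updated in a world frame using the external localization source mentioned above.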