Abstract

Peanut is an essential economic oil crop around the world. Therefore, accurate and real-time detection of peanut seed germination is necessary for peanut field management. However, traditional monitoring of peanut seedling germination is time-consuming and labor-intensive, especially for large fields. In this work, we propose to reduce the time lag in detecting peanut germination failures by combining the power of deep learning-based object detection (OD) and unmanned aerial systems (UAS) to identify early in-field peanut germination. To find the most suitable object detection model, we first compared the performance of two representative OD models, Faster RCNN and SSD, in identifying peanut seedlings from UAS imagery obtained with a multispectral camera (MicaSense RedEdge). At an Intersection over Union (IoU) threshold of 0.5, the SSD model achieved an F1 score of 0.82, while Faster RCNN achieved 0.85. Through extensive ablations, we found that deeper models improved performance only marginally while being more expensive in computation and inference time. Interestingly, we found that the performance of RGB-based seedling detection (0.917 mAP) is comparable to that of R-RedEdge-NIR (0.919 mAP), indicating that a remote sensing setup with a regular RGB camera can detect peanut seedlings as well as a more expensive multispectral camera system. With extensive experimentation, we conclude that combining inexpensive remote sensing, the rapid acquisition of UAS-based imagery, and the efficiency of OD methods is practical for early peanut germination detection.
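The F1 scores above are computed at an IoU threshold of 0.5, the standard criterion for matching a predicted bounding box to a ground-truth box in object detection. A minimal sketch of that IoU computation (not from the paper; box format assumed to be (x_min, y_min, x_max, y_max) in pixels):

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes
    given as (x_min, y_min, x_max, y_max)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (clamped to zero if boxes are disjoint)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1)
             - inter)
    return inter / union if union else 0.0

# Two 10x10 boxes overlapping by half their width share 50 of 150
# union pixels, i.e. IoU = 1/3 -- below the 0.5 matching threshold.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.333...
```

A detection whose IoU with a ground-truth seedling box reaches at least 0.5 counts as a true positive; F1 is then the harmonic mean of the resulting precision and recall.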
