Highlights

- Generated a custom imagery dataset using UAS, UGV, and handheld sensors separately in diseased corn fields.
- Proposed a data pipeline utilizing the Google Sheets API to establish communication between each platform and enable access through a web application.
- Developed a deep learning-based disease management system for above- and below-the-canopy corn disease diagnosis.
- Trained and evaluated disease detection models for each platform separately to provide management recommendations.

Abstract

Early disease management following the onset of disease symptoms is crucial for controlling disease spread. Heterogeneous collaboration between unmanned aerial systems (UAS) and unmanned ground vehicles (UGV) for field scouting and disease diagnosis is a potential approach for developing automated disease management solutions. However, automation of crop-specific disease identification requires the use of above- and below-canopy sensors and properly trained deep learning (DL) models. This research proposes a novel disease management system for diagnosing corn diseases from above and below the canopy by collaboratively using edge devices mounted on a UAS and a UGV, respectively. Three separate datasets were acquired using a UAS above the canopy, a UGV below the canopy, and handheld imaging platforms within diseased corn fields. DL-based image classification models were first trained to identify common corn diseases under field conditions, resulting in up to 95.04% testing accuracy using the DenseNet169 architecture. After creating bounding-box annotations for the disease images, You Only Look Once (YOLO)v7 DL-based object detection models were trained to identify diseases from each platform separately. The trained YOLOv7 models achieved the highest mAP@IoU=0.5 of 37.6%, 46.4%, and 72.2% for locating and identifying diseases above the canopy using the UAS, below the canopy using the UGV, and with handheld sensors, respectively. A client/server architecture was developed to establish communication between the UAS, UGV, and Google Spreadsheets over the Wi-Fi communication protocol. The coordinates of diseased regions and the distinct disease types were recorded on Google Spreadsheets using this client/server architecture. A web application utilized the data from the Google Spreadsheet to help users diagnose diseases in real time and provide recommendations for implementing appropriate disease management practices. This study reports findings from independently operated UAS and UGV that, by combining their information, can potentially map disease spread from below and above the corn canopy.

Keywords: Deep learning, Disease identification, Disease management, Object detection, UAS, UGV, YOLOv7.
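The abstract reports classification results with the DenseNet169 architecture. The following is a minimal sketch, assuming PyTorch/torchvision, of how such a model could be fine-tuned for corn disease classification; the class count, learning rate, and training loop are illustrative assumptions, not the authors' settings.

```python
# Hedged sketch: fine-tuning DenseNet169 for corn disease classification (assumptions noted).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # hypothetical: e.g. healthy + three common corn diseases

# Start from ImageNet-pretrained weights and replace the classifier head.
model = models.densenet169(weights=models.DenseNet169_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # illustrative hyperparameter

def train_step(images, labels):
    """Run one optimization step over a batch of field images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```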
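The abstract also describes a client/server pipeline in which edge devices on the UAS and UGV record diseased-region coordinates and disease types on Google Spreadsheets for a web application to consume. The sketch below shows one way such logging could look using the gspread client for the Google Sheets API; the sheet name, column layout, and credential file are hypothetical and not taken from the paper.

```python
# Hedged sketch: appending a detection record to a shared Google Spreadsheet (assumptions noted).
import datetime

import gspread  # Google Sheets API client


def log_detection(platform, disease, lat, lon, confidence):
    """Append one detection so the web application can read it in near real time."""
    gc = gspread.service_account(filename="service_account.json")  # hypothetical key file
    worksheet = gc.open("corn_disease_detections").sheet1          # hypothetical sheet name
    worksheet.append_row([
        datetime.datetime.utcnow().isoformat(),  # timestamp
        platform,                                # "UAS", "UGV", or "handheld"
        disease,                                 # e.g. "gray leaf spot"
        lat, lon,                                # coordinates of the diseased region
        confidence,                              # detector confidence score
    ])


# Example call from the detection loop on an edge device:
# log_detection("UGV", "gray leaf spot", 41.5868, -93.6250, 0.87)
```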
