Abstract

The use of unmanned aerial vehicles (UAVs) provides a timely and low-cost means of acquiring high-spatial-resolution imagery for crop disease detection. In this study, convolutional neural networks (CNNs) and RGB-based high-spatial-resolution UAV images were explored to detect wheat stripe rust transmission centers (infected areas accounted for less than 1.35% of the imagery) under complex field conditions in Hubei, China. To take full advantage of end-to-end learning, a CNN semantic segmentation architecture (DeepLabv3+) was applied to classify the imagery per pixel into healthy wheat and stripe-rust-infected wheat (SRIW). Using a rich dataset covering diverse field conditions and sunlight illumination properties, we were able to detect SRIW accurately (rust-class F1-score = 0.81). The study also evaluated the impact of the classification framework and spatial resolution on model training. The results revealed that model accuracy for the rust class improved when a multi-branching binary framework was used instead of a multi-class framework for CNN training with unbalanced classes, and that a coarser spatial resolution (8 cm) significantly decreased model accuracy (rust-class F1-score). In addition, a macro disease index (MDI) was defined to quantitatively measure the occurrence of SRIW. Our results demonstrate the capability of ultra-high-spatial-resolution UAV imaging for detecting SRIW. With the end-to-end deep learning segmentation method greatly reducing the need for intensive preprocessing, the combination of CNNs and RGB-based ultra-high-spatial-resolution UAV images provides a simple and rapid method for accurate, large-scale detection of crop disease.
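The multi-branching binary framework mentioned above can be read as training a separate binary (one-vs-rest) segmentation branch for the rust class rather than a single multi-class softmax head. The snippet below is a minimal sketch of that idea, assuming the third-party segmentation_models_pytorch package for the DeepLabv3+ backbone; the encoder, loss, and learning rate are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch (not the authors' exact setup): a binary DeepLabv3+ branch for
# the rust class, assuming the segmentation_models_pytorch package is available.
import torch
import segmentation_models_pytorch as smp

# One binary branch: rust vs. everything else. A multi-class alternative would
# instead set classes=3 (background / healthy wheat / rust) with a softmax + CE loss.
model = smp.DeepLabV3Plus(
    encoder_name="resnet34",     # illustrative encoder choice
    encoder_weights="imagenet",  # pretrained weights as a common default
    in_channels=3,               # RGB UAV imagery
    classes=1,                   # single logit -> sigmoid for the rust branch
)

loss_fn = smp.losses.DiceLoss(mode="binary")  # Dice loss is a common choice for unbalanced classes
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, masks: torch.Tensor) -> float:
    """One gradient step on a batch of (B,3,H,W) images and (B,1,H,W) rust masks."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)
    loss = loss_fn(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()
```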
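The abstract does not give the formula for the macro disease index (MDI). One plausible reading is the fraction of wheat pixels classified as stripe rust within a field or grid cell; the helper below computes such a pixel-ratio index from a predicted segmentation map and is an illustrative assumption only, not the paper's definition.

```python
import numpy as np

def macro_disease_index(rust_mask: np.ndarray, wheat_mask: np.ndarray) -> float:
    """Hypothetical MDI: share of wheat pixels predicted as stripe-rust-infected.

    rust_mask  -- boolean array, True where the model predicts rust
    wheat_mask -- boolean array, True for all wheat pixels (healthy or infected)
    The actual MDI definition in the paper may differ; this is an illustrative ratio.
    """
    wheat_pixels = wheat_mask.sum()
    if wheat_pixels == 0:
        return 0.0
    return float(np.logical_and(rust_mask, wheat_mask).sum() / wheat_pixels)
```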
