Abstract

Small unmanned aerial systems (UAS) have emerged as high-throughput platforms for collecting high-resolution image data over large crop fields to support precision agriculture and plant breeding research. At the same time, this improved efficiency in image capture is producing massive datasets, which pose challenges for extracting the needed phenotypic data. To complement these high-throughput platforms, crop improvement increasingly requires robust image analysis methods capable of handling large amounts of image data. Approaches based on deep learning models are currently the most promising and show unparalleled performance on large image datasets. This study developed and applied an image analysis approach based on a SegNet deep learning semantic segmentation model to estimate sorghum panicle counts, a critical phenotypic variable in sorghum crop improvement, from UAS images over selected sorghum experimental plots. The SegNet model was trained to semantically segment UAS images into sorghum panicles, foliage, and exposed ground using 462 labeled images of 250 × 250 pixels; the trained model was then applied to the field orthomosaic to generate a field-level semantic segmentation. Individual panicle locations were obtained after post-processing the segmentation output to remove small objects and split merged panicles. A comparison between model panicle count estimates and manually digitized panicle locations in 60 randomly selected plots showed an overall detection accuracy of 94%. A per-plot panicle count comparison also showed high agreement between estimated and reference panicle counts (Spearman correlation ρ = 0.88, mean bias = 0.65). Misclassification of panicles during the semantic segmentation step and mosaicking errors in the field orthomosaic were the main sources of panicle detection error. Overall, the deep learning semantic segmentation approach shows good promise; with a larger labeled dataset and extensive hyperparameter tuning, it should provide even more robust and effective characterization of sorghum panicle counts.
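
The study's post-processing code is not part of this excerpt; the following is a minimal sketch of the two operations the abstract names, removing small objects and splitting merged panicles, assuming a binary panicle mask produced by the segmentation step. It uses scikit-image and SciPy; the min_size and min_peak_distance thresholds are illustrative assumptions, not values from the study.

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.measure import regionprops
    from skimage.morphology import remove_small_objects
    from skimage.segmentation import watershed

    def count_panicles(panicle_mask, min_size=50, min_peak_distance=10):
        """Count individual panicles in a binary panicle-class mask."""
        # 1) Remove small misclassified blobs.
        mask = remove_small_objects(panicle_mask.astype(bool), min_size=min_size)

        # 2) Split merged panicles: place markers at local maxima of the
        #    distance transform, then grow them back with a watershed.
        distance = ndi.distance_transform_edt(mask)
        peaks = peak_local_max(distance, min_distance=min_peak_distance, labels=mask)
        markers = np.zeros(mask.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = watershed(-distance, markers, mask=mask)

        # Each labeled region is taken as one panicle.
        centroids = [r.centroid for r in regionprops(labels)]
        return len(centroids), centroids

Summing the returned counts over each plot's portion of the mask would yield per-plot estimates of the kind the abstract compares against the manually digitized reference counts.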

Highlights

  • Recent years have seen unmanned aerial systems (UAS) emerge as effective means for field-relevant phenotyping activities by enabling efficient and more affordable collection of aerial crop images over entire crop growth cycles

  • Accuracy assessment of the trained SegNet model: we evaluated the accuracy of the trained deep learning model against the remaining 10% of labeled test data using two metrics, overall accuracy (OA) and intersection over union (IoU); a sketch of both metrics follows this list

  • The segmentation performance in terms of intersection over union (IoU) ranged from 80% to 93%, showing high agreement between the deep learning segmentation and the reference labeled data
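
Neither metric's computation is shown in the excerpt; the sketch below computes both from a pair of flattened label maps via a confusion matrix. The three-class encoding (0 = panicle, 1 = foliage, 2 = exposed ground) is an assumed convention, not taken from the paper.

    import numpy as np

    def overall_accuracy_and_iou(y_true, y_pred, num_classes=3):
        """Overall accuracy (OA) and per-class IoU from two label maps."""
        # Confusion matrix: rows are reference classes, columns predictions.
        cm = np.zeros((num_classes, num_classes), dtype=np.int64)
        np.add.at(cm, (y_true.ravel(), y_pred.ravel()), 1)

        oa = np.trace(cm) / cm.sum()  # fraction of correctly labeled pixels
        tp = np.diag(cm)              # per-class true positives
        # IoU = intersection / union, where union = pred + ref - intersection.
        # Assumes every class occurs in the test data (nonzero denominators).
        iou = tp / (cm.sum(axis=0) + cm.sum(axis=1) - tp)
        return oa, iou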

Introduction

Recent years have seen unmanned aerial systems (UAS) emerge as effective means for field-relevant phenotyping activities by enabling efficient and more affordable collection of aerial crop images over entire crop growth cycles. The massive image data being collected by UAS and other high-throughput platforms pose challenges to traditional methods, whose performance tends to level off and fall short of the high accuracy required for fully automated systems [6]. It is not feasible for an expert to hand-craft a comprehensive feature set that accounts for the variability across thousands of images. This is even more complicated in agricultural environments, where image quality is influenced by changes in illumination, crop growth, and senescence [4,8]. Our panicle counting approach involved two main steps: (1) deep learning semantic segmentation of input image data and (2) post-processing to facilitate individual panicle counting. The semantic segmentation served to classify and accurately locate panicles within an image; the second step then separated and counted individual panicles.
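
The excerpt does not show how the trained per-tile model was applied across the much larger field orthomosaic; a common approach, sketched below under assumed interfaces, is sliding-window inference. Here model is a hypothetical stand-in callable mapping a 250 × 250 × 3 tile to a 250 × 250 array of class labels; the real SegNet interface is not given in the text.

    import numpy as np

    def segment_orthomosaic(ortho, model, tile=250):
        """Label every pixel of an (H, W, 3) orthomosaic, tile by tile."""
        h, w = ortho.shape[:2]
        out = np.zeros((h, w), dtype=np.uint8)
        for y in range(0, h, tile):
            for x in range(0, w, tile):
                patch = ortho[y:y + tile, x:x + tile]
                ph, pw = patch.shape[:2]
                if (ph, pw) != (tile, tile):
                    # Pad edge tiles to the model's fixed input size,
                    # then crop the prediction back to the true extent.
                    padded = np.zeros((tile, tile, 3), dtype=ortho.dtype)
                    padded[:ph, :pw] = patch
                    out[y:y + ph, x:x + pw] = model(padded)[:ph, :pw]
                else:
                    out[y:y + tile, x:x + tile] = model(patch)
        return out

The resulting field-level label map is what the post-processing step would operate on to extract individual panicle locations.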

