Abstract

This study investigates the potential of high‐resolution (<0.5 cm/pixel) aerial imagery and convolutional neural networks (CNNs) for disease incidence scoring in sugar beet, focusing on two important aphid‐transmitted viruses, beet mild yellowing virus (BMYV) and beet chlorosis virus (BChV). The development of tolerant sugar beet cultivars is imperative in the context of increased disease management concerns following the ban on neonicotinoids in the European Union. However, traditional methods of disease phenotyping, which rely on visual assessment by human experts, are both time‐consuming and subjective. This study therefore assessed whether aerial multispectral and RGB images could be harnessed to perform automated disease ratings comparable to those performed by trained experts. To this end, two variety trials were conducted in 2021 and 2022. The 2021 dataset was used to train and validate a CNN model on five cultivars, while the 2022 dataset was used to test the model on two cultivars different from those used in 2021. Additionally, this study tested the use of transformed features instead of raw spectral bands to improve the generalization of CNN models. The results showed that the best-performing model was the one trained for BMYV on RGB images using transformed features rather than conventional raw bands. This model achieved a root mean square error of 11.45% between model and expert scores. These results indicate that while high‐resolution aerial imagery and CNNs hold great promise, a complete replacement of human expertise is not yet possible. This research contributes an innovative approach to disease phenotyping, driving advances in sustainable agriculture and crop breeding.
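To illustrate the kind of pipeline the abstract describes, the sketch below shows a small CNN regressor that maps aerial image patches of sugar beet plots to a disease incidence score and reports the root mean square error against expert ratings. This is a minimal, hypothetical example: the input patch size, layer configuration, and training settings are assumptions for illustration and do not reproduce the architecture or transformed features used in the study.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(128, 128, 3)):
    """Toy CNN regressor for plot-level disease incidence (0-100%).
    Layer sizes are illustrative assumptions, not the paper's model."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1),  # predicted incidence score in percent
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def rmse(expert_scores, model_scores):
    """Root mean square error between expert and model incidence scores."""
    expert_scores = np.asarray(expert_scores, dtype=float)
    model_scores = np.asarray(model_scores, dtype=float)
    return float(np.sqrt(np.mean((expert_scores - model_scores) ** 2)))

# Hypothetical usage on held-out plots (train_patches, train_scores,
# test_patches, test_scores are placeholders for real data):
# model = build_cnn()
# model.fit(train_patches, train_scores, epochs=50, validation_split=0.2)
# preds = model.predict(test_patches).ravel()
# print(f"RMSE vs. expert scores: {rmse(test_scores, preds):.2f} %")
```

In the study's setup, training and validation would draw on the 2021 trial (five cultivars) and testing on the 2022 trial (two unseen cultivars), with the reported RMSE of 11.45% corresponding to the best BMYV model on RGB imagery with transformed features.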
