Plant disease negatively impacts food production and quality, so it is crucial to detect and recognise plant diseases correctly. Traditional approaches do not offer a rapid, comprehensive management system for detecting plant diseases. Deep learning (DL) techniques have achieved encouraging results in discriminating patterns and anomalies in visual samples, providing an effective means of diagnosing plant disease symptoms automatically. However, one limitation of recent studies is that in-field disease detection remains underexplored, so a model that performs well on in-field samples is needed. The objective of this study is to develop and investigate DL techniques for in-field disease detection of barley (Hordeum vulgare L.), one of the main crops in Australia, using visual samples captured at barley trials with a consumer-grade RGB camera. A dataset was collected from test-bed trials across multiple paddocks infected with three diseases: net form net blotch (NFNB), spot form net blotch (SFNB), and scald, under various weather conditions. The collected data, 312 images of 6000 × 4000 pixels, were divided into patches of 448 × 448 pixels, which were manually annotated into four classes: no-disease, scald, NFNB, and SFNB. Finally, the data were augmented with random rotations and flips to increase the dataset size. The resulting barley disease dataset was then used to fine-tune several well-known pre-trained DL networks, including DenseNet, ResNet, InceptionV3, Xception, and MobileNet, as network backbones; given limited data, these pre-trained networks can still be trained to detect anomalies in visual samples. The results show that MobileNet, Xception, and InceptionV3 performed well in barley disease detection, whereas ResNet showed poor classification ability.
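The pre-processing pipeline described above (tiling each 6000 × 4000 frame into non-overlapping 448 × 448 patches, then augmenting with random rotations and flips) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation; the function names are our own.

```python
import numpy as np

PATCH = 448  # patch size used in the study


def extract_patches(image: np.ndarray, patch: int = PATCH) -> list:
    """Tile an H x W x C image into non-overlapping patch x patch crops,
    discarding any partial crops at the right/bottom edges (assumed policy)."""
    h, w = image.shape[:2]
    return [
        image[r:r + patch, c:c + patch]
        for r in range(0, h - patch + 1, patch)
        for c in range(0, w - patch + 1, patch)
    ]


def augment(patch_img: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Random 90-degree rotation plus random horizontal/vertical flips,
    one plausible reading of 'random rotation and flip'."""
    out = np.rot90(patch_img, k=int(rng.integers(0, 4)))
    if rng.random() < 0.5:
        out = np.fliplr(out)
    if rng.random() < 0.5:
        out = np.flipud(out)
    return out
```

With these assumptions, a 6000 × 4000 frame yields floor(6000/448) × floor(4000/448) = 13 × 8 = 104 patches, so the 312 collected images expand to tens of thousands of labelled patches before augmentation.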
Moreover, augmenting the data improves the performance of the DL networks, particularly for underperforming backbones such as ResNet, and mitigates the limited data available to these data-intensive networks. The augmentation step improved MobileNet's performance by approximately 6 %. MobileNet achieved the highest accuracy of 98.63 % (averaged over the three diseases) in binary classification and an accuracy of 93.50 % in multi-class classification. Even though distinguishing SFNB from NFNB is challenging in their early stages, MobileNet achieved the lowest misclassification rate between the two diseases. These results show the efficiency of this model in diagnosing barley diseases from complex data collected in the field environment. In addition, the model is lighter and comprises fewer trainable parameters. Consequently, MobileNet is suitable for small training datasets, reducing data acquisition costs.
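The SFNB/NFNB confusion reported above can be quantified as a pairwise misclassification rate derived from a confusion matrix. The sketch below is illustrative only: the helper names, the label order, and the toy labels are our assumptions, not the study's evaluation code.

```python
import numpy as np

# Assumed label order matching the four annotated classes.
CLASSES = ["no-disease", "scald", "NFNB", "SFNB"]


def confusion_matrix(y_true, y_pred, n: int = len(CLASSES)) -> np.ndarray:
    """n x n matrix where entry [t, p] counts true-class-t samples
    predicted as class p."""
    m = np.zeros((n, n), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m


def pairwise_misclassification(m: np.ndarray, i: int, j: int) -> float:
    """Fraction of class-i and class-j samples confused with each other,
    e.g. i=2 (NFNB) and j=3 (SFNB)."""
    confused = m[i, j] + m[j, i]
    total = m[i].sum() + m[j].sum()
    return confused / total
```

A lower value of `pairwise_misclassification(m, 2, 3)` corresponds to better separation of NFNB from SFNB, the criterion on which MobileNet led the other backbones.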