Abstract

A country’s economic growth is influenced by its agricultural output. However, Plant Diseases (PD) pose a substantial obstacle to crop cultivation and food quality. The timely detection of PDs is paramount for public health and the promotion of Sustainable Agriculture (SA). The conventional diagnostic procedure entails a pathologist’s visual evaluation of a particular plant through in-person visits. However, manual inspection of crop diseases is limited by its low accuracy and the shortage of skilled experts. To address these concerns, automated methodologies capable of effectively identifying and classifying a wide range of PDs are needed. The precise detection and categorization of PDs is challenging for several reasons: low-contrast regions in both the image background and foreground, strong color similarity between healthy and diseased plant regions, noise in the specimens, and variations in the location, color, structure, and size of plant leaves. This paper presents a novel approach for identifying and categorizing PDs using a Deep Convolutional Neural Network - Transfer Learning (DCNN-TL) technique in the Agricultural Operation System (AOS). The proposed method aims to enhance the capability of SA systems to accurately identify and categorize PDs. The improved Deep Learning (DL) methodology incorporates a TL technique based on a fine-tuned Visual Geometry Group 19 (VGG19) architecture. The revised system accurately detects and diagnoses five distinct PD categories. Among the evaluated methods, the proposed DCNN-TL shows outstanding precision, recall, and accuracy values of 0.996, 0.9994, and 0.9998, respectively.
