Plants are essential to all forms of life. Plant pests, diseases, and their symptoms most often appear on leaves and fruits, and sometimes in the roots. Expert diagnosis is expensive, tedious, and time-consuming when samples require laboratory analysis. Failure to detect plant symptoms and diseases early is a core biotic cause of increased plant stress, degraded plant structure and health, reduced subsistence-farming yields, and threats to global food security. Reducing inappropriate herbicide application and enabling early plant disease detection and classification (PDDC) are therefore significant solutions to these problems at the social, economic, and environmental levels. Advances in transfer learning have produced effective results in smart farming and are now widely used in disease identification and classification research. This study presents a novel hybrid inception-xception (IX) convolutional neural network (CNN). The proposed model combines inception modules with depthwise separable convolution layers to capture multi-scale features while reducing model complexity and overfitting. In contrast to ordinary CNN architectures, it widens the network for better feature extraction, improving PDDC performance, which demands diverse feature competencies. The study further presents a real-time artificial intelligence (AI) application, available for MATLAB, Android, and Servlet, that automatically identifies and classifies diseases from the leaf environment using the improved CNN together with machine learning (ML) and computer vision techniques. To assess the performance of the proposed IX-CNN model, different classifiers, namely support vector machine (SVM), decision tree (DT), and random forest (RF), were used. The experiments used six datasets: PlantVillage, Turkey Disease, Plant Doc, Rice Disease, RoCole, and NLB. The Plant Doc, PlantVillage, and Turkey Disease datasets achieved an accuracy of 100%.
Rice Disease, RoCole, and NLB attained an accuracy of 99.79%, 99.95%, and 98.64%, respectively.
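The complexity reduction that the abstract attributes to depthwise separable convolutions can be illustrated with a simple parameter-count comparison. This is a sketch: the kernel size and channel counts below are illustrative assumptions, not values taken from the paper.

```python
def standard_conv_params(k, c_in, c_out):
    # A standard k x k convolution learns one k x k x c_in filter
    # per output channel.
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    # Depthwise step: one k x k filter per input channel,
    # followed by a 1x1 pointwise convolution that mixes channels.
    return k * k * c_in + c_in * c_out

# Illustrative Xception-style setting (assumed values, not from the paper).
k, c_in, c_out = 3, 256, 256
std = standard_conv_params(k, c_in, c_out)   # 589,824 parameters
sep = separable_conv_params(k, c_in, c_out)  # 67,840 parameters
print(std, sep, round(std / sep, 1))         # roughly an 8.7x reduction
```

Factoring each convolution this way is what lets the hybrid model stack inception-style multi-branch blocks for multi-scale features without the parameter growth a standard convolution would incur.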