Abstract

Viral and bacterial infections are among the most prevalent microbial causes of reduced agricultural output worldwide, and identifying the responsible pathogens under current conditions remains challenging. Biosensors have become standard tools for monitoring microbial and viral macromolecules, and disease diagnosis is improved by tracking the nanoparticles released during infection. Because the sensors' data contain diverse learning patterns, Machine Learning (ML) methods are used to analyze and interpret them. This paper investigates whether Near-infrared (nIR) and Red, Green, and Blue (RGB) imaging can be used to characterize and detect Plant Disease (PD) using Convolutional Neural Network (CNN)-based Feature Extraction (FE) and Feature Classification (FC). A home-built sensing device based on Single-Walled Carbon NanoTubes (SWCNTs) functionalized with a Deoxyribonucleic Acid (DNA) aptamer that binds hemin (HeApt + DNA + SWCNT) was used to acquire nIR and RGB images of tea plant leaf samples. Three labels are extracted from the nIR + RGB images using a Wasserstein Distance (WD)-based Feature Extraction Model (FEM), and these labels are then fed into the proposed CNN model for precise classification. The proposed Wasserstein Distance-to-Convolutional Neural Network (WD2CNN) model was compared with several CNN architectures on the same dataset and achieved the highest accuracy, 98.72%. It is also the most computationally efficient, with the shortest average time per epoch. The model's high performance and efficiency in classifying biosensor images could aid in the early detection and prevention of Crop Diseases (CD).
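
To illustrate the kind of pipeline the abstract describes, the sketch below computes Wasserstein distances between nIR and RGB pixel-intensity distributions as features and combines them with a small CNN classifier. The function names, layer sizes, and the use of SciPy/PyTorch are illustrative assumptions for a minimal sketch, not the authors' WD2CNN implementation.

```python
# Minimal sketch of a WD-based feature extraction step feeding a CNN classifier.
# All design choices here (histogram-free WD over raw intensities, layer sizes,
# feature fusion before the final layer) are assumptions, not the paper's code.
import numpy as np
import torch
import torch.nn as nn
from scipy.stats import wasserstein_distance


def wd_features(nir: np.ndarray, rgb: np.ndarray) -> np.ndarray:
    """Wasserstein distance between nIR pixel intensities and each RGB channel
    (a hypothetical stand-in for the WD-based Feature Extraction Model)."""
    return np.asarray(
        [wasserstein_distance(nir.ravel(), rgb[..., c].ravel()) for c in range(3)],
        dtype=np.float32,
    )


class SmallCNN(nn.Module):
    """A small CNN over stacked nIR + RGB channels; the WD features are
    concatenated with the convolutional features before classification."""

    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32 + 3, n_classes)  # 3 extra WD features

    def forward(self, images: torch.Tensor, wd_feats: torch.Tensor) -> torch.Tensor:
        x = self.conv(images).flatten(1)
        return self.fc(torch.cat([x, wd_feats], dim=1))


if __name__ == "__main__":
    # Synthetic example: one 64x64 leaf sample with an nIR channel and RGB channels.
    nir = np.random.rand(64, 64).astype(np.float32)
    rgb = np.random.rand(64, 64, 3).astype(np.float32)
    wd_feats = torch.from_numpy(wd_features(nir, rgb)).unsqueeze(0)
    stacked = torch.from_numpy(
        np.concatenate([nir[None], rgb.transpose(2, 0, 1)], axis=0)
    ).unsqueeze(0)
    logits = SmallCNN()(stacked, wd_feats)
    print(logits.shape)  # torch.Size([1, 3])
```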
