Abstract

Deep learning is an active research topic in image processing and computer vision and has been employed in many fields owing to its excellent performance. So far, however, deep learning methods have rarely been applied to plant disease recognition; most existing work relies on public datasets of images zoomed in on plant leaves. The limited adoption of deep learning models in plant disease recognition has several main causes: the large size of deep learning models, which makes them difficult to deploy on embedded systems; their high computational complexity, which requires a large memory overhead; and the complex backgrounds of field images, which make it hard to train an efficient model. To address these challenges, this paper proposes an effective lightweight network architecture, MobInc-Net, for crop disease recognition and detection. In this study, the Inception module was enhanced by replacing its original convolutions with depth-wise and point-wise convolutions, and the resulting modified Inception (M-Inception) module, paired with a pre-trained MobileNet, was chosen as the backbone to extract high-quality image features. A fully connected Softmax layer sized to the actual number of categories and an SSD block were then separately added behind the backbone network for classifying and detecting crop disease types, respectively. Two-stage transfer learning was applied during training to obtain an efficient model. Experimental results show that the proposed method attains the desired performance, with an average recognition accuracy of 99.21% on the public dataset and 97.89% on the local dataset.
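The lightweight design rests on replacing standard convolutions with depth-wise plus point-wise (depthwise-separable) convolutions. A minimal sketch of the parameter-count arithmetic behind that substitution is shown below; the layer sizes used (3×3 kernels, 256 input and output channels) are hypothetical examples, not values taken from the paper.

```python
# Parameter-count comparison: standard convolution vs. depthwise-separable
# convolution (depth-wise followed by point-wise), ignoring bias terms.
# Illustrative sketch only; layer dimensions are hypothetical.

def standard_conv_params(c_in, c_out, k):
    """Weights in a standard k x k convolution layer."""
    return k * k * c_in * c_out

def depthwise_separable_params(c_in, c_out, k):
    """Depth-wise k x k conv (one filter per input channel)
    followed by a point-wise 1 x 1 conv for cross-channel mixing."""
    depthwise = k * k * c_in   # one k x k filter per input channel
    pointwise = c_in * c_out   # 1 x 1 conv mixes channels
    return depthwise + pointwise

# Example: a 3 x 3 layer with 256 input and 256 output channels.
std = standard_conv_params(256, 256, 3)        # 589,824 weights
sep = depthwise_separable_params(256, 256, 3)  #  67,840 weights
print(f"standard: {std:,}  separable: {sep:,}  reduction: {std / sep:.1f}x")
```

For this example the substitution cuts the weight count by roughly 8.7×, which is the kind of saving that makes an Inception-style module feasible on embedded hardware.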
