Abstract

The yield of maize (corn) suffers significant losses due to nutrient deficiencies, so their timely detection is an important task. For this, Machine Learning (ML) models from computer science can be applied. Traditional ML methods involve the difficult task of extracting numerous minute features by hand from hundreds of labelled images. This limitation of conventional methods can be overcome by a 'transfer learning' approach, in which the features learned by a pre-trained Deep Convolutional Neural Network (CNN) are carried over to a new, comparatively small image dataset. This study therefore aimed to evaluate and compare three state-of-the-art CNN models for maize deficiency detection using transfer learning. The CNN models were pre-trained on the Plant Village dataset and then fine-tuned on a self-captured maize deficiency dataset collected from the fields of S.A.S. Nagar, Punjab (India). Using data augmentation and transfer learning, the experiments show that deep CNNs can be trained with only a few labelled images. The best results were obtained by ZFNet, with an accuracy of 97%, a Mean Reciprocal Rank of 99% and a Mean Average Precision of 98%. The implemented CNNs are suitable for real-time applications, with a classification time of less than 1 s per image.

Keywords

Deep learning, Convolutional neural networks, Transfer learning, Computer vision, Image classification, Pattern recognition
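To illustrate the transfer learning workflow the abstract describes, the sketch below fine-tunes a pre-trained CNN on a small image dataset in PyTorch. It is a minimal sketch, not the paper's implementation: ZFNet pre-trained on Plant Village is not distributed with torchvision, so a ResNet-18 backbone with ImageNet weights stands in, and the dataset path, augmentation transforms, and training hyperparameters are assumptions for illustration only.

# Minimal transfer-learning sketch (assumptions: ResNet-18/ImageNet stands in
# for the paper's ZFNet/Plant Village; dataset path and hyperparameters are
# hypothetical).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Data augmentation for the small labelled dataset (specific transforms assumed).
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: one subdirectory per deficiency class.
train_ds = datasets.ImageFolder("maize_deficiency/train", transform=train_tf)
train_dl = DataLoader(train_ds, batch_size=16, shuffle=True)

# Load a pre-trained backbone, freeze the transferred features, and replace
# the classifier head so it matches the number of deficiency classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))

# Only the new head is optimized; the pre-trained features are reused as-is.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a few epochs often suffice on a small dataset
    for x, y in train_dl:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

Freezing the backbone and training only the final layer is the simplest fine-tuning regime; unfreezing deeper layers at a lower learning rate is a common variant when somewhat more labelled data is available.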
