Abstract
Classifying and mapping weeds is crucial for keeping plantations free of unwanted plants, particularly where plant spacing is uneven, which makes weed detection more challenging than crop detection. To address this issue, Deep Neural Networks (DNNs) are often used in agriculture to identify the distinguishing features of weeds and classify them from low-resolution imagery, helping to control weed populations. Convolutional Neural Networks (CNNs) in Deep Learning (DL) have advanced rapidly, achieving remarkable performance in plant-weed classification. Despite their success in computer vision, CNNs still require massive labeled datasets, struggle with intra-class variation, and incur high processing costs in classification tasks. Meanwhile, transformer-based models have demonstrated strong global-context modeling capabilities but have yet to be thoroughly investigated in agriculture. In this study, we explore Self-Attention Vision Transformers (ViT) using a Transfer Learning (TL) approach to classify weeds in benchmark datasets. The proposed model, ViT-S16, was pretrained on ImageNet-21k, fine-tuned on ImageNet-1k, and evaluated on the public benchmark agriculture dataset DeepWeeds, which contains images of various weed species captured under different conditions. Our experimental results show that ViT-S16 outperforms the state-of-the-art CNN models ResNeXt-50 and EfficientNet-B7, achieving a top accuracy of 98.54%. These results demonstrate the potential of ViTs for diverse image classification tasks; future work can examine the strengths and weaknesses of other ViT variants.