Abstract

Artificial Intelligence (AI) has changed how processes are developed and decisions are made in agriculture, replacing manual, repetitive processes with automated, more efficient ones. This study applies deep learning techniques to detect and segment weeds in agricultural crops, using models with different architectures to analyze images captured by an Unmanned Aerial Vehicle (UAV). It contributes to the computer vision field by comparing the performance of You Only Look Once (YOLOv8n, YOLOv8s, YOLOv8m, and YOLOv8l), Mask R-CNN (implemented with the Detectron2 framework), and U-Net models, and by making public a dataset of aerial images of soybean and bean crops. The models were trained on a dataset of 3021 images, randomly divided into training, validation, and test sets, which were annotated, resized, and augmented using the Roboflow application interface. The models were evaluated and compared using mAP50 and mAP50-95 (training efficiency), precision, accuracy, and recall. The YOLOv8s variant achieved the highest performance, with an mAP50 of 97%, precision of 99.7%, and recall of 99%. These results show that deep learning models can detect weeds automatically and efficiently when trained on a large, well-labeled dataset. Furthermore, the study demonstrates the strong potential of advanced object segmentation algorithms for detecting weeds in soybean and bean crops.
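For readers unfamiliar with the metrics named above, the sketch below shows how precision, recall, and the IoU matching underlying mAP50 are defined. This is an illustrative implementation, not the paper's evaluation code; the function names and the example boxes are assumptions.

```python
# Illustrative definitions of the detection metrics mentioned in the abstract.
# precision = TP / (TP + FP); recall = TP / (TP + FN). mAP50 averages per-class
# average precision, counting a prediction as a true positive only when its
# IoU with a ground-truth box is at least 0.50.

def precision(tp: int, fp: int) -> float:
    """Fraction of predicted detections that are correct."""
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp: int, fn: int) -> float:
    """Fraction of ground-truth objects that were detected."""
    return tp / (tp + fn) if tp + fn else 0.0

def iou(box_a, box_b) -> float:
    """Intersection over Union for axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

# Two 10x10 boxes shifted by half their width overlap with IoU = 1/3,
# below the 0.50 threshold used by mAP50.
print(round(iou((0, 0, 10, 10), (5, 0, 15, 10)), 3))  # → 0.333
```

The mAP50-95 variant repeats this matching at IoU thresholds from 0.50 to 0.95 in steps of 0.05 and averages the resulting average-precision values, which rewards tighter localization.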
