Abstract

The direct effect of global warming on viticulture is already apparent, with unexpected pests and diseases among its most concerning consequences. Deploying sticky traps on grape plantations to attract key insects has been the backbone of conventional pest management programs. However, trap inspection is a time-consuming process for winegrowers, carried out visually through the manual identification and counting of key insects. Additionally, winegrowers usually lack the taxonomy expertise required for accurate species identification. This paper explores the use of deep learning at the edge to identify and quantify pest counts automatically. Different mobile devices were used to acquire a dataset of yellow sticky and delta traps, consisting of 168 images with 8966 key insects manually annotated by experienced taxonomy specialists. Five deep learning models suitable for running locally on mobile devices were selected, trained, and benchmarked to detect five insect species. Model-centric, data-centric, and deployment-centric strategies were explored to improve and fine-tune the considered models, which were then tested on low-end and high-end mobile devices. The SSD ResNet50 model proved to be the most suitable architecture for deployment on edge devices, with per-class accuracies ranging from 82% to 99%, F1 scores ranging from 58% to 84%, and inference speeds per trap image of 19.4 s and 62.7 s for high-end and low-end smartphones, respectively. These results demonstrate the potential of the proposed approach to be integrated into a mobile-based solution for vineyard pest monitoring, providing automated detection and counting of key vector insects to winegrowers and taxonomy specialists.
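To illustrate the on-device inference step summarized above, the sketch below runs a converted detection model on a single trap photograph and tallies detections per insect class. This is a minimal sketch only: it assumes the trained SSD ResNet50 detector has been exported to TensorFlow Lite with the standard detection post-processing outputs (boxes, classes, scores, count); the model path, class labels, and confidence threshold are illustrative and not taken from the paper.

```python
# Sketch: on-device counting of key insects in a sticky trap image.
# Assumes a TensorFlow Lite export of the detector; file names, labels,
# and the score threshold below are placeholders, not the paper's values.
from collections import Counter

import numpy as np
import tensorflow as tf
from PIL import Image

MODEL_PATH = "ssd_resnet50_traps.tflite"  # hypothetical exported model
LABELS = ["species_1", "species_2", "species_3", "species_4", "species_5"]
SCORE_THRESHOLD = 0.5  # illustrative confidence cut-off

interpreter = tf.lite.Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Resize the trap photograph to the detector's expected input resolution.
_, height, width, _ = input_details[0]["shape"]
image = Image.open("trap_photo.jpg").convert("RGB").resize((width, height))
input_data = np.expand_dims(np.asarray(image), axis=0)
if input_details[0]["dtype"] == np.float32:
    # Float models typically expect inputs normalized to [-1, 1].
    input_data = (input_data.astype(np.float32) - 127.5) / 127.5

interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()

# Output tensor order can vary between exports; here we assume
# [boxes, classes, scores, num_detections], as produced by the
# TensorFlow Object Detection API's TFLite export.
classes = interpreter.get_tensor(output_details[1]["index"])[0]
scores = interpreter.get_tensor(output_details[2]["index"])[0]

counts = Counter(
    LABELS[int(cls)]
    for cls, score in zip(classes, scores)
    if score >= SCORE_THRESHOLD
)
print(dict(counts))
```

In a mobile deployment the same interpreter calls would run through the TensorFlow Lite runtime on the device itself, which is what makes the per-image inference times reported for high-end and low-end smartphones directly comparable.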
