Abstract
In this study, an algorithm was implemented with a computer vision model to detect and classify coffee fruits and to map the fruit maturation stage during harvest. The main contribution of this study is the assignment of geographic coordinates to each video frame, which enables the mapping of detection summaries across coffee rows. The model used to detect and classify coffee fruits was implemented with Darknet, an open-source neural network framework written in C. Coffee fruit detection and classification were performed with the YOLOv3-tiny object detection system. For this study, 90 videos were recorded at the end of the discharge conveyor of a coffee harvester during the 2020 harvest of arabica coffee (Catuaí 144) in a commercial area in the region of Patos de Minas, in the state of Minas Gerais, Brazil. Model performance peaked at approximately the 3300th iteration for an input resolution of 800 × 800 pixels. The model achieved an mAP of 84%, F1-score of 82%, precision of 83%, and recall of 82% on the validation set. The average precision for the classes of unripe, ripe, and overripe coffee fruits was 86%, 85%, and 80%, respectively. Because the algorithm performed detection and classification on videos collected during harvest, it was possible to map the qualitative attributes of coffee maturation stage along the crop rows. These attribute maps provide managers with important spatial information for applying precision agriculture techniques to crop management. Additionally, this study should encourage future research on customizing deep learning models for specific tasks in agriculture and precision agriculture.
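The abstract's central idea, georeferencing per-frame detection summaries so they can be mapped along coffee rows, can be illustrated with a minimal sketch. The paper does not publish its code, so the data structures, field names, and the linear GPS interpolation below are assumptions for illustration only, not the authors' implementation; the per-frame class counts would come from the YOLOv3-tiny detector described above.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical structures; illustrative only, not the paper's actual code.

@dataclass
class GpsFix:
    t: float    # timestamp in seconds
    lat: float
    lon: float

@dataclass
class FrameSummary:
    t: float                  # frame timestamp in seconds
    counts: Dict[str, int]    # detections per maturation class, e.g. {"unripe": 3, "ripe": 7}

def interpolate_fix(track: List[GpsFix], t: float) -> GpsFix:
    """Linearly interpolate the harvester position at time t from a GPS track."""
    if t <= track[0].t:
        return track[0]
    if t >= track[-1].t:
        return track[-1]
    for a, b in zip(track, track[1:]):
        if a.t <= t <= b.t:
            w = (t - a.t) / (b.t - a.t)
            return GpsFix(t, a.lat + w * (b.lat - a.lat), a.lon + w * (b.lon - a.lon))
    return track[-1]

def georeference_frames(frames: List[FrameSummary], track: List[GpsFix]) -> List[dict]:
    """Attach a geographic coordinate and maturation fractions to each frame summary."""
    rows = []
    for f in frames:
        fix = interpolate_fix(track, f.t)
        total = sum(f.counts.values()) or 1
        rows.append({
            "lat": fix.lat,
            "lon": fix.lon,
            "ripe_fraction": f.counts.get("ripe", 0) / total,
            **f.counts,
        })
    return rows

if __name__ == "__main__":
    # Toy example: two GPS fixes and two frame summaries (hypothetical values).
    track = [GpsFix(0.0, -18.5800, -46.5180), GpsFix(10.0, -18.5801, -46.5181)]
    frames = [FrameSummary(2.0, {"unripe": 3, "ripe": 7, "overripe": 1}),
              FrameSummary(8.0, {"unripe": 1, "ripe": 9, "overripe": 2})]
    for row in georeference_frames(frames, track):
        print(row)
```

Each output row pairs a coordinate with the class counts and ripe fraction for one frame; aggregating such rows over a harvest pass is what would produce the maturation-stage attribute maps described in the abstract.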