Abstract

In tomato-producing fields, automated large-area detection methods are critical for fruit health monitoring and harvesting. However, because tomato fruits carry limited feature information, large-area detection at long distances leads to more missed or incorrect detections. To address this issue, this research proposes an improved YOLOv8 network, RSR-YOLO, for long-distance identification of tomato fruits. First, this paper designs a partial group convolution (PgConv) and, building on it, an innovative FasterNet (IFN) module for feature extraction, taking into account the impact of split operations on the computational complexity of the backbone network. The IFN module is lightweight and efficient, improving both the detection accuracy and the real-time performance of the model. Second, given the critical role that low-level features play in small-target recognition and localization, this research incorporates the Gather-and-Distribute (GD) mechanism and redesigns the feature fusion module to extract and fuse tomato features at multiple levels. Finally, Repulsion Loss is adopted to address the impact of fruit overlap and leaf occlusion on detection results. RSR-YOLO achieves precision, recall, F1 score, and mean average precision (mAP@0.5) of 91.6%, 85.9%, 88.7%, and 90.7%, respectively, marking increases of 4.2%, 4.0%, 4.2%, and 3.6% compared to YOLOv8n. In addition, this paper develops a dedicated graphical user interface (GUI) for the real-time tomato detection task.
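The abstract does not give implementation details of PgConv, so the following is only a minimal PyTorch-style sketch of the partial (group) convolution idea, assuming it follows the FasterNet pattern of convolving just a subset of channels while the remaining channels pass through untouched; the module name, partial ratio, and group count here are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class PartialGroupConv(nn.Module):
    """Illustrative partial group convolution (PgConv-style) block.

    Only a fraction of the input channels is convolved, using a grouped
    3x3 convolution; the rest pass through unchanged and are concatenated
    back. This keeps FLOPs and memory access low, which is the core idea
    behind FasterNet-style partial convolution.
    """

    def __init__(self, channels: int, partial_ratio: float = 0.25, groups: int = 4):
        super().__init__()
        self.conv_channels = int(channels * partial_ratio)   # channels actually convolved
        self.pass_channels = channels - self.conv_channels   # channels left untouched
        self.partial_conv = nn.Conv2d(
            self.conv_channels, self.conv_channels,
            kernel_size=3, padding=1, groups=groups, bias=False
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Split along the channel dimension and convolve only the first part.
        x_conv, x_pass = torch.split(x, [self.conv_channels, self.pass_channels], dim=1)
        x_conv = self.partial_conv(x_conv)
        return torch.cat((x_conv, x_pass), dim=1)


if __name__ == "__main__":
    block = PartialGroupConv(channels=64)
    out = block(torch.randn(1, 64, 80, 80))
    print(out.shape)  # torch.Size([1, 64, 80, 80])
```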
