Abstract

Rice panicle density is one of the essential bases for the automatic speed regulation of unmanned harvesters, making density detection crucial for their intelligent upgrading. Existing methods for detecting rice panicle density are not suited to actual harvesting scenarios and struggle to meet real-time requirements. To address this, we developed a real-time rice panicle density detection method for unmanned harvesters. The method comprises a panicle detection model based on YOLOv5n (RP-YOLO) and a rice panicle density calculation based on coordinate transformation. RP-YOLO was optimized through several techniques: enhancing the target detection head, reconfiguring the backbone network and downsampling module, introducing an attention mechanism, and refining the loss function. Using coordinate conversion, we transformed the image coordinates of the detection-box vertices into world coordinates and calculated the panicle density. We established the RP-1668 dataset for japonica rice and trained and tested the model on it. Compared to the original YOLOv5n model, our modifications reduced floating-point operations (FLOPs) by 33.33 %, decreased model size by 31.90 %, increased detection speed by 12.63 %, and improved accuracy by 3.82 % in AP0.5 (6.96 % in AP0.5:0.95). RP-YOLO achieved superior accuracy and detection speed compared to both conventional lightweight and non-lightweight models. In field applications, the density detection error was less than 10 % relative to manual counting, and the results clearly reflected changes in rice panicle density. For a 1.4 m × 1.0 m rice field imaging area (at a resolution of 2560 × 1280), the method runs at 15 fps on an on-board industrial computer, providing reliable data support for adjusting the operating speed of driverless harvesters.
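The density step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a planar ground surface and a known 3×3 homography `H` mapping image pixels to ground-plane metres (the function and variable names are hypothetical). The imaged region's corners are projected to world coordinates, the covered ground area is computed, and density is the detection count divided by that area.

```python
import numpy as np

def image_to_world(points, H):
    """Map Nx2 image pixel coordinates to ground-plane (world) coordinates
    using a 3x3 homography H (image -> world, assumed known from calibration)."""
    pts = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                 # perspective divide

def polygon_area(pts):
    """Shoelace formula for the area of a simple polygon (vertices in order)."""
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def panicle_density(num_detections, image_corners, H):
    """Panicles per square metre over the imaged ground region."""
    world_corners = image_to_world(np.asarray(image_corners, dtype=float), H)
    return num_detections / polygon_area(world_corners)

# Example matching the abstract's setup: a 2560 x 1280 px image covering
# a 1.4 m x 1.0 m patch (pure scaling homography, assumed for illustration).
H = np.diag([1.4 / 2560, 1.0 / 1280, 1.0])
corners = [(0, 0), (2560, 0), (2560, 1280), (0, 1280)]
density = panicle_density(21, corners, H)  # 21 detected panicles over 1.4 m^2
```

With these assumed numbers, 21 panicles over 1.4 m² gives a density of 15 panicles/m²; in practice `H` would come from camera calibration rather than a pure scale.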
