Abstract
Thousand-grain weight is the main parameter for accurately estimating rice yield, and it is an important indicator for variety breeding and cultivation management. Accurate detection and counting of rice grains is an essential prerequisite for thousand-grain weight measurement. However, because rice grains are small targets with high mutual similarity and varying degrees of adhesion, accurately detecting and counting them during thousand-grain weight measurement remains challenging. Therefore, a deep learning model based on a transformer encoder and a coordinate attention module, named TCLE-YOLO, was designed for detecting and counting rice grains, with YOLOv5 used as the backbone network. Specifically, to improve the model's feature representation of small target regions, a coordinate attention (CA) module was introduced into the backbone of YOLOv5. In addition, an extra detection head for small targets was designed based on a low-level, high-resolution feature map, and a transformer encoder was applied to the neck module to expand the receptive field of the network and enhance the extraction of key features of the detected targets. This makes the additional detection head more sensitive to rice grains, especially grains with heavy adhesion. Finally, EIoU loss was used to further improve accuracy. Experimental results on the self-built rice grain dataset show that the precision, recall, and mAP@0.5 of the TCLE-YOLO model were 99.20%, 99.10%, and 99.20%, respectively. Compared with several state-of-the-art models, the proposed TCLE-YOLO model achieves better detection performance. In summary, the rice grain detection method developed in this study is suitable for rice grain recognition and counting, and it can provide guidance for accurate thousand-grain weight measurement and the effective evaluation of rice breeding.
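To make the CA component concrete, the sketch below shows a minimal PyTorch implementation of the standard coordinate attention block (as introduced by Hou et al., 2021): the feature map is pooled along the height and width axes separately so that the resulting attention weights retain positional information in both directions, which is what makes the mechanism helpful for small, densely packed targets such as rice grains. This is an illustrative sketch of the generic design, not the authors' exact module; the channel/reduction settings and the placement inside the YOLOv5 backbone are assumptions.

```python
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    """Illustrative coordinate attention block (generic design, Hou et al. 2021).

    Note: a sketch for exposition only; the paper's exact implementation
    details (reduction ratio, activation, insertion points) may differ.
    """

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # pool over W -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # pool over H -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                          # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)      # (B, C, W, 1)
        y = torch.cat([x_h, x_w], dim=2)              # joint encoding along both axes
        y = self.act(self.bn1(self.conv1(y)))
        x_h, x_w = torch.split(y, [h, w], dim=2)
        x_w = x_w.permute(0, 1, 3, 2)                 # back to (B, mid, 1, W)
        a_h = torch.sigmoid(self.conv_h(x_h))         # direction-aware attention, height
        a_w = torch.sigmoid(self.conv_w(x_w))         # direction-aware attention, width
        return x * a_h * a_w                          # reweight the input feature map


if __name__ == "__main__":
    # Smoke test on a dummy backbone feature map.
    feat = torch.randn(1, 256, 80, 80)
    print(CoordinateAttention(256)(feat).shape)  # torch.Size([1, 256, 80, 80])
```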