Abstract

Rice seed size is one of the most important traits affecting rice yield and quality. Current automated seed-measurement technologies have several inherent drawbacks: first, seeds must be preprocessed before measurement; second, heavily overlapping seeds are not permitted; and third, image acquisition with equipment such as scanners is slow. To address these issues, we propose YOLO-rot, a rotational-perception deep learning model for automatic seed counting and size measurement. The YOLO-rot algorithm consists of four major steps: (1) partitioning the image into overlapping sub-images; (2) detecting whole seeds, for which the data representation, detection module, IoU evaluation, non-maximum suppression algorithm, and loss function of the YOLOv5 model are all improved; (3) fusing the detection results; and (4) determining the actual length and width of each seed. We evaluated how two photographic methods and different sub-image overlap rates affect the detection accuracy of YOLO-rot. The experimental results show that mAP@0.5 reaches a maximum of 0.92 when the overlap between sub-images is 250 pixels. We also compared YOLO-rot with ImageJ, GrainScan, and GridFree: across all 900 seeds, the MSE of the length and width estimates from YOLO-rot did not exceed 0.11. These promising results led to the development of an application tool for real-time automatic seed measurement.
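As a rough illustration of steps (1) and (4), the sketch below partitions an image into overlapping tiles and converts the side lengths of a detected rotated box into physical seed dimensions. This is not the authors' implementation: the tile size, the 250-pixel overlap default, and the pixels-per-millimetre scale are illustrative assumptions.

```python
# Illustrative sketch only (not the YOLO-rot source code).
import numpy as np

def partition_image(image: np.ndarray, tile: int = 1024, overlap: int = 250):
    """Yield (x0, y0, sub-image) tiles that overlap by `overlap` pixels."""
    h, w = image.shape[:2]
    step = tile - overlap
    for y0 in range(0, max(h - overlap, 1), step):
        for x0 in range(0, max(w - overlap, 1), step):
            y1, x1 = min(y0 + tile, h), min(x0 + tile, w)
            yield x0, y0, image[y0:y1, x0:x1]

def seed_size_mm(box_w_px: float, box_h_px: float, px_per_mm: float):
    """Convert a rotated bounding box's side lengths (pixels) into seed
    length and width in millimetres; the longer side is taken as length."""
    length_px, width_px = max(box_w_px, box_h_px), min(box_w_px, box_h_px)
    return length_px / px_per_mm, width_px / px_per_mm

# Example usage with a synthetic image and an assumed scale of 20 px/mm.
img = np.zeros((3000, 4000, 3), dtype=np.uint8)
tiles = list(partition_image(img))                       # overlapping sub-images
length_mm, width_mm = seed_size_mm(160.0, 60.0, px_per_mm=20.0)
```

In practice the pixels-per-millimetre scale would be calibrated from the imaging setup (e.g. a reference object in the scene), and the detections from all tiles would be fused with rotated non-maximum suppression before sizes are computed.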
