Abstract

Wheat is one of the most widely cultivated crops in the world. Accurate and efficient high-throughput ear counting is important for wheat production, yield evaluation, and seed breeding. Traditional wheat ear counting is inefficient because only small areas can be surveyed manually. In field scenes in particular, images obtained from different platforms, including ground systems and unmanned aerial vehicles (UAVs), differ in density, scale, and wheat ear distribution, so cross-platform ear counting remains challenging. To this end, a density map counting network (LWDNet) was constructed for cross-platform wheat ear statistics. First, CA-MobileNetV3 was constructed by introducing a collaborative attention (CA) mechanism to optimize the lightweight MobileNetV3 network, which was used as the front end of the feature extraction network, aiming to address the occlusion and adhesion of wheat ears in the field. Second, to enhance the model's ability to learn the detailed features of wheat ears, the CARAFE upsampling module was introduced in the feature fusion layer to better restore ear characteristics and improve counting accuracy. Finally, density map regression was used to count high-density, small-target ears, and the model was tested on datasets from different platforms. The results showed that our method can efficiently count wheat ears across spatial scales, achieving good accuracy while maintaining a competitive number of parameters (2.38 million, with a size of 9.24 MB), providing technical support for wheat breeding and screening analysis.
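The density map regression mentioned above rests on a standard construction that the abstract does not spell out: each annotated ear centre is replaced by a unit-mass Gaussian, so that integrating (summing) the resulting map recovers the ear count, and a network like LWDNet is trained to predict that map from the image. The sketch below illustrates only this ground-truth construction with numpy; the kernel size and sigma are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gaussian_kernel(size: int, sigma: float) -> np.ndarray:
    """2-D Gaussian kernel normalized to sum to 1 (unit mass)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def points_to_density_map(points, h, w, size=15, sigma=3.0):
    """Place one unit-mass Gaussian at each annotated ear centre.

    Summing the returned map gives the ear count, which is what a
    density-map counting network is trained to reproduce.
    """
    dm = np.zeros((h, w), dtype=np.float64)
    k = gaussian_kernel(size, sigma)
    r = size // 2
    for (y, x) in points:
        # Clip the kernel window to the image borders.
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        ky0, ky1 = y0 - (y - r), size - ((y + r + 1) - y1)
        kx0, kx1 = x0 - (x - r), size - ((x + r + 1) - x1)
        patch = k[ky0:ky1, kx0:kx1]
        # Renormalize clipped kernels so every point keeps mass exactly 1.
        dm[y0:y1, x0:x1] += patch / patch.sum()
    return dm

# Three hypothetical ear annotations in a 64x64 image patch.
points = [(20, 30), (40, 50), (10, 10)]
dm = points_to_density_map(points, 64, 64)
count = dm.sum()  # integrates to the number of annotated ears
```

At inference time, the predicted density map is summed the same way, which is what makes the approach robust to the high densities and small targets the abstract describes.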
