Abstract

Deep learning is increasingly popular for precise plant detection and counting in computer vision applications. Despite rising cross-disciplinary research in this field, an important issue has been neglected: because of the inherent constraints imposed by the natural growth patterns of plants, collecting plant image data is costly and time-consuming, resulting in a scarcity of real-world data. Consequently, researchers are often confined to large-scale data representations or narrowly scoped applications, and to date no generalized network architecture excels in demanding environments. To address this, we introduce two new bi-directional cascade neural network paradigms, PlantBiCNet and its lightweight variant, PlantBiCNet-Lite. Unlike previous research, we place greater emphasis on the decoding process rather than the encoder. Our bi-directional cascade, weighted-fusion decoding approach fully exploits high-level semantics and low-level spatial detail, strengthening the localization signal while preserving gradient diversity. We evaluate PlantBiCNet and PlantBiCNet-Lite on five public plant detection and counting datasets and a novel plant dataset. Across these six small-scale datasets, the average precision (AP50) reaches 0.905 with a coefficient of determination (R²) of 0.935, at a reported frame rate of 196 FPS. These results significantly outperform state-of-the-art computer vision methods. Moreover, we find that introducing the Dropout mechanism markedly enhances the model's performance and generalization ability. Further analysis supports our experiments and provides valuable insights into network optimization. Overall, our open datasets and network modeling details provide a solid foundation for future research in plant science.
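The abstract does not show how the weighted-fusion decoding is implemented; the sketch below illustrates only the general idea of fusing decoder branches with learnable, normalized weights (as in BiFPN-style designs). It is a minimal PyTorch illustration under that assumption: the module name WeightedFusion and the two-branch setup are hypothetical, not the authors' actual PlantBiCNet code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightedFusion(nn.Module):
    """Fuse same-shape feature maps with learnable, normalized weights.

    Hypothetical sketch of BiFPN-style weighted fusion; not the authors'
    PlantBiCNet module.
    """
    def __init__(self, num_inputs: int, eps: float = 1e-4):
        super().__init__()
        # One non-negative scalar weight per input branch, learned end to end.
        self.weights = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, feats):
        # ReLU keeps the weights non-negative; normalizing them makes the
        # fusion a convex combination, so no single branch dominates and
        # gradients keep flowing through every input path.
        w = F.relu(self.weights)
        w = w / (w.sum() + self.eps)
        return sum(wi * f for wi, f in zip(w, feats))

if __name__ == "__main__":
    fuse = WeightedFusion(num_inputs=2)
    # Two feature maps at the same resolution, e.g. a top-down (semantic)
    # branch and a bottom-up (spatial) branch of a decoder.
    top_down = torch.randn(1, 64, 32, 32)
    bottom_up = torch.randn(1, 64, 32, 32)
    fused = fuse([top_down, bottom_up])
    print(fused.shape)  # torch.Size([1, 64, 32, 32])
```

In such designs, the learned weights let the decoder balance semantically rich but coarse features against spatially precise but shallow ones per fusion node, which is consistent with the emphasis the abstract places on combining high-level semantics with low-level spatial detail.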
