Abstract

The number of flowers is essential for evaluating the growth status of litchi trees and enables researchers to estimate flowering rates and conduct various phenotypic studies, particularly those focusing on individual panicles. However, manual counting remains the primary method for quantifying flowers, and insufficient emphasis has been placed on developing reliable deep learning methods for flower estimation and integrating them into research. Furthermore, current density map-based methods are susceptible to background interference. To tackle the challenge of accurately quantifying small and dense male litchi flowers, a framework for counting the flowers in panicles is proposed. First, an established and effective algorithm, YOLACT++, is used to segment individual panicles from images. Second, a novel density map regression algorithm, FlowerNet, is proposed to accurately count the flowers in each panicle. By employing a multitask learning approach, FlowerNet effectively captures both foreground and background information, thereby overcoming interference from non-target areas during the pixel-level regression task. It achieves a mean absolute error of 47.71 and a root mean squared error of 61.78 on the constructed flower dataset. Additionally, a regression equation is established on a dataset of inflorescences to examine the algorithm's application to flower counting. It captures the relationship between the number of flowers predicted by FlowerNet and the manually counted number, yielding a coefficient of determination (R²) of 0.81. The proposed algorithm shows promise for automated estimation of litchi flowering quantity and can serve as a valuable reference for litchi orchard management during the flowering period.
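The evaluation pipeline described above can be sketched in a few lines of NumPy. In density map regression, the predicted count is the sum (discrete integral) of the per-pixel density values; the reported metrics (MAE, RMSE) and the R² of a linear fit between predicted and manual counts then follow directly. The function names and the toy data below are illustrative, not from the paper:

```python
import numpy as np

def count_from_density_map(density_map):
    # The predicted flower count is the integral (sum) of the
    # predicted per-pixel density values over the panicle region.
    return float(np.sum(density_map))

def evaluate_counts(predicted, manual):
    # Counting metrics of the kind reported in the abstract:
    # mean absolute error, root mean squared error, and the
    # coefficient of determination R^2 of a least-squares fit
    # between predicted and manually counted flower numbers.
    predicted = np.asarray(predicted, dtype=float)
    manual = np.asarray(manual, dtype=float)
    mae = float(np.mean(np.abs(predicted - manual)))
    rmse = float(np.sqrt(np.mean((predicted - manual) ** 2)))
    slope, intercept = np.polyfit(predicted, manual, 1)
    fitted = slope * predicted + intercept
    ss_res = np.sum((manual - fitted) ** 2)
    ss_tot = np.sum((manual - np.mean(manual)) ** 2)
    r2 = float(1.0 - ss_res / ss_tot)
    return mae, rmse, r2
```

For example, a predicted density map whose values sum to 250 corresponds to an estimated 250 flowers in that panicle; collecting such estimates across panicles and comparing them with manual counts yields the MAE, RMSE, and R² figures quoted above.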
