Abstract
Sea buckthorn is an extremely drought-tolerant, resilient and sustainable crop that can be grown in areas with harsh climates and scarce resources, providing a source of nutrition and income for the local population. Image-based yield estimation methods allow for better management of sea buckthorn cultivation to improve its productivity and sustainability, and the error in fruit yield information caused by occlusion can be substantially reduced by combining and analysing image features extracted with binocular cameras. In this paper, mature wild sea buckthorn in the mountainous areas north of Hohhot City, Inner Mongolia Autonomous Region, was used as the study target. First, complete images of sea buckthorn branches were collected with binocular cameras and features were extracted. The extracted features comprise the colour index of the sea buckthorn fruits, the number of fruits and four texture parameters: ASM, CON, COR and HOM (angular second moment, contrast, correlation and homogeneity). Features significantly correlated with sea buckthorn fruit weight were selected by computing correlations among the feature parameters; the selected features were then fed into a BP neural network for training to obtain the sea buckthorn yield estimation model. The results showed that the best yield estimation model was obtained by combining the COR index with the colour index and the number of sea buckthorn fruits, with a coefficient of determination R² = 0.99267 and a root mean square error RMSE = 0.5214.
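As a rough illustration of the pipeline described above (GLCM texture parameters, correlation-based feature screening, and a BP/back-propagation network regressor), the following Python sketch shows one possible implementation. The colour-index formula, correlation threshold and network size are illustrative assumptions rather than the authors' settings, and the fruit-count feature, which would come from a separate detection step, is omitted here.

import numpy as np
from skimage.color import rgb2gray
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPRegressor


def branch_features(img_rgb):
    """Colour index plus the four GLCM texture parameters ASM, CON, COR, HOM."""
    r, g, b = img_rgb[..., 0].astype(float), img_rgb[..., 1].astype(float), img_rgb[..., 2].astype(float)
    # Hypothetical excess-red colour index; the paper's exact colour index is not specified here.
    colour_index = np.mean((2.0 * r - g - b) / (r + g + b + 1e-6))
    gray = (rgb2gray(img_rgb) * 255).astype(np.uint8)
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    asm = graycoprops(glcm, "ASM")[0, 0]
    con = graycoprops(glcm, "contrast")[0, 0]
    cor = graycoprops(glcm, "correlation")[0, 0]
    hom = graycoprops(glcm, "homogeneity")[0, 0]
    return [colour_index, asm, con, cor, hom]


def train_yield_model(X, y, corr_threshold=0.5):
    """Keep features whose correlation with fruit weight exceeds a threshold,
    then fit a back-propagation (MLP) regressor as the yield estimation model."""
    X, y = np.asarray(X), np.asarray(y)
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    keep = corr >= corr_threshold  # assumed significance criterion for illustration
    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
    model.fit(X[:, keep], y)
    return model, keep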