Abstract

Automated monitoring of the rice leaf area index (LAI) using near-ground sensing platforms, such as inspection robots, is essential for modern rice precision management. These robots carry several complementary sensors whose capabilities partially overlap, providing redundancy and enhanced reliability. Leveraging multi-sensor fusion to improve the accuracy of LAI monitoring has therefore become a crucial research focus. This study presents a rice LAI monitoring model that applies ensemble learning algorithms to fused data from RGB and multi-spectral cameras. The results indicate that fusing vegetation indices and texture features from the RGB and multi-spectral sensors effectively improves the estimation accuracy of the rice LAI monitoring model. The model based on the LightGBM regression algorithm shows the greatest improvement in accuracy, with a coefficient of determination (R²) of 0.892, a root mean square error (RMSE) of 0.270, and a mean absolute error (MAE) of 0.160. Furthermore, LAI estimation accuracy at the jointing stage is higher than at the heading stage: at the jointing stage, both LightGBM based on the optimal RGB image features and Random Forest based on the fused features achieved an R² of 0.95. This study provides a technical reference for automatically monitoring rice growth parameters in the field using inspection robots.
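The sketch below illustrates the general workflow the abstract describes: feature-level fusion of RGB and multi-spectral features followed by LightGBM regression and evaluation with R², RMSE, and MAE. It assumes the vegetation indices and texture features have already been extracted into per-plot tables; the file names, feature columns, hyperparameters, and train/test split are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): fuse RGB and multi-spectral
# features, then fit a LightGBM regressor to estimate rice LAI.
import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Hypothetical tables: one row per plot, columns are vegetation indices
# and texture features from each sensor, plus the measured LAI.
rgb_feats = pd.read_csv("rgb_features.csv")            # e.g. ExG, GLCM textures
ms_feats = pd.read_csv("multispectral_features.csv")   # e.g. NDVI, NDRE, textures
lai = pd.read_csv("measured_lai.csv")["LAI"]

# Feature-level fusion: concatenate the two feature sets column-wise.
X = pd.concat([rgb_feats, ms_feats], axis=1).to_numpy()
y = lai.to_numpy()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Hyperparameters here are placeholders, not the tuned values from the study.
model = LGBMRegressor(n_estimators=500, learning_rate=0.05, random_state=42)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R2  :", r2_score(y_test, pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, pred)))
print("MAE :", mean_absolute_error(y_test, pred))
```

Swapping `LGBMRegressor` for `sklearn.ensemble.RandomForestRegressor` reproduces the Random Forest variant compared in the study under the same fused-feature setup.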
