Abstract

In the context of climate change, extreme weather events such as frost injury are having an increasingly negative impact on the growth of tea plants, causing substantial losses to the tea industry. Traditionally, frost injury to tea plants in the field has been assessed visually, which is labor-intensive and subjective. In this research, multimodal remote sensing data were collected from naturally overwintering tea plantations at different periods using unmanned aerial vehicles (UAVs) equipped with multispectral (MS), thermal infrared (TIR), and RGB sensors. Physiological data from tea leaves collected on the same days were used to construct a tea cold injury score (TCIS). A convolutional neural network-gated recurrent unit (CNN-GRU) model was then developed to estimate TCIS. To benchmark the CNN-GRU, a single GRU model and three classical machine learning models were also evaluated. The study found that: (1) multimodal data fusion was superior to unimodal data, with the best predictions achieved by the bimodal MS + RGB combination (Rp2 = 0.862, RMSEP = 0.138, RPD = 2.220); (2) the CNN-GRU hybrid model outperformed the other four baseline models, achieving its best results with the multivariate inputs MS + RGB (Rp2 = 0.862) and MS + RGB + TIR (Rp2 = 0.850); (3) the accuracy of the model after removing soil features was lower than that of the model without background removal. The TCIS-CNN-GRU model combined with multi-source remote sensing data can therefore evaluate the cold injury phenotype of tea plants objectively and accurately, making the CNN-GRU approach more scientifically grounded and promising.
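The reported figures of merit (Rp2, RMSEP, RPD) follow standard chemometric definitions: the coefficient of determination on the prediction set, the root mean square error of prediction, and the ratio of the standard deviation of the reference values to RMSEP. A minimal sketch of these computations, assuming those standard definitions (the function name and example values are illustrative, not from the paper):

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute Rp2, RMSEP, and RPD for a held-out prediction set.

    Rp2   = 1 - SS_res / SS_tot         (coefficient of determination)
    RMSEP = sqrt(SS_res / n)            (root mean square error of prediction)
    RPD   = SD(y_true) / RMSEP          (SD = sample standard deviation)
    """
    n = len(y_true)
    mean_y = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    rmsep = math.sqrt(ss_res / n)
    sd = math.sqrt(ss_tot / (n - 1))  # sample standard deviation of reference values
    rpd = sd / rmsep
    return r2, rmsep, rpd

# Illustrative data only (not from the study):
r2, rmsep, rpd = regression_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

An RPD above roughly 2 is conventionally read as a model usable for quantitative prediction, which is why the MS + RGB result (RPD = 2.220) supports the paper's conclusion.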
