Acquiring phenological event data is crucial for studying the impacts of climate change on forest dynamics and assessing the risks associated with the early onset of young leaves. Large-scale mapping of forest phenological timing using Earth observation (EO) data could enhance our understanding of these processes by adding a spatial component. However, translating traditional ground-based phenological observations into reliable ground truth for training and validating EO mapping applications remains challenging. This study explored the feasibility of predicting high-resolution phenological phase data for European beech (Fagus sylvatica) using unoccupied aerial vehicle (UAV)-based multispectral indices and machine learning. Employing a comprehensive feature selection process, we identified the most effective sensors, vegetation indices, training data partitions, and machine learning models for phenological phase prediction. The model that performed best and generalized well across sites used the Green Chromatic Coordinate (GCC) index and Generalized Additive Model (GAM) boosting. The GCC training data, derived from the radiometrically calibrated visual bands of a multispectral sensor, were predicted using uncalibrated RGB sensor data. The final GCC/GAM boosting model predicted phenological phases on unseen datasets within a root mean squared error (RMSE) of 0.5. This research highlights the potential interoperability among common UAV-mounted sensors, particularly the utility of readily available, low-cost RGB sensors. However, considerable limitations were observed with indices that incorporate the near-infrared band, owing to oversaturation. Future work will focus on adapting models to better align with the ICP Forests phenological flushing stages.
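For reference, the Green Chromatic Coordinate is conventionally defined as the green band's share of total visible reflectance, GCC = G / (R + G + B). The minimal sketch below illustrates how GCC could be computed per pixel from RGB band arrays; the function and variable names are illustrative and not taken from the study.

import numpy as np

def green_chromatic_coordinate(red, green, blue, eps=1e-6):
    """Compute the Green Chromatic Coordinate (GCC) per pixel.

    GCC = G / (R + G + B); higher values indicate greener canopies,
    which makes the index a common proxy for leaf flushing.
    """
    red, green, blue = (np.asarray(band, dtype=float) for band in (red, green, blue))
    total = red + green + blue
    # Guard against division by zero in dark or masked pixels.
    return green / np.where(total > 0, total, eps)

# Illustrative use with small synthetic reflectance arrays (not study data).
r = np.array([[0.10, 0.12], [0.09, 0.11]])
g = np.array([[0.25, 0.30], [0.22, 0.28]])
b = np.array([[0.08, 0.07], [0.06, 0.09]])
print(green_chromatic_coordinate(r, g, b))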