Unmanned aerial vehicle (UAV) technologies have emerged as promising tools to improve forest ecosystem assessments. These technologies offer high-resolution data that can significantly enhance evaluations of forest structure, condition, and disturbance severity. UAV-borne sensors such as LiDAR and multispectral cameras provide complementary information about forest attributes, capturing structural and spectral details, yet their integration for comprehensive forest assessment remains understudied. In this paper, we explored the potential of combining UAV LiDAR and multispectral data to assess the disturbance severity of a West African forest patch (Benin). We developed an integrated disturbance index (IDI) that fuses structural properties from LiDAR data and spectral characteristics from multispectral vegetation indices through principal component analysis (PCA). This allowed us to delineate low (> 0.65), medium (0.35–0.65), and high (< 0.35) forest disturbance levels. We applied the IDI to the 560-ha Ewe-Adakplame relict forest in Benin, West Africa, and achieved 95 % overall accuracy in disturbance detection, outperforming both LiDAR-only (80 %) and multispectral-only (75 %) approaches. The IDI revealed that 23 % of the forest area experiences low disturbance, while 28 % and 49 % face medium and high disturbance levels, respectively. These findings highlight that more than three-quarters of this relict forest is under considerable stress, underscoring the urgent need for tailored conservation strategies to strengthen forest resilience. This method's ability to differentiate disturbance levels can inform resource allocation, prioritize conservation efforts, and guide the development of site-specific management plans.
The integration of UAV LiDAR and multispectral data demonstrated here has potential for application across diverse tropical forest patches, providing an effective means to monitor forest health, assess disturbance severity, and support data-driven decision-making in forest conservation and sustainable management.
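The PCA-based fusion and thresholding described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the feature names (canopy height, canopy cover, NDVI), the synthetic data, and the min-max rescaling of the first principal component to [0, 1] are all assumptions; only the disturbance thresholds (> 0.65 low, 0.35–0.65 medium, < 0.35 high) come from the abstract.

```python
import numpy as np

# Hypothetical per-pixel features: two LiDAR structural metrics and one
# multispectral vegetation index. Real inputs would be rasterized UAV products.
rng = np.random.default_rng(0)
n = 1000
features = np.column_stack([
    rng.uniform(0, 30, n),     # canopy height (m)        - LiDAR (assumed)
    rng.uniform(0, 1, n),      # canopy cover fraction    - LiDAR (assumed)
    rng.uniform(-0.1, 0.9, n)  # NDVI                     - multispectral (assumed)
])

# Standardize features, then fuse them via PCA: take the first principal
# component (eigenvector of the largest covariance eigenvalue).
z = (features - features.mean(axis=0)) / features.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
pc1 = z @ eigvecs[:, -1]  # np.linalg.eigh sorts eigenvalues ascending

# Rescale PC1 to [0, 1] (an assumed normalization) so the paper's
# thresholds can be applied as an integrated disturbance index (IDI).
idi = (pc1 - pc1.min()) / (pc1.max() - pc1.min())

def disturbance_class(v: float) -> str:
    """Map an IDI value to the disturbance levels given in the abstract."""
    if v > 0.65:
        return "low"
    if v >= 0.35:
        return "medium"
    return "high"

classes = np.array([disturbance_class(v) for v in idi])
```

In practice the per-class areas (the 23 % / 28 % / 49 % split reported above) would follow from counting pixels in each class and multiplying by the pixel footprint.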