Abstract
With an increase in the frequency and severity of wildfires across the globe and resultant changes to long-established fire regimes, the mapping of fire severity is a vital part of monitoring ecosystem resilience and recovery. The emergence of unoccupied aircraft systems (UAS) and compact sensors (RGB and LiDAR) provides new opportunities to map fire severity. This paper compares metrics derived from UAS Light Detection and Ranging (LiDAR) point clouds and UAS image-based products to classify fire severity. A workflow that derives novel metrics describing vegetation structure and fire severity from UAS remote sensing data is developed, fully utilising the vegetation information available in both data sources. UAS imagery and LiDAR data were captured pre- and post-fire over a 300 m by 300 m study area in Tasmania, Australia. The study area featured a vegetation gradient from sedgeland vegetation (e.g., button grass, 0.2 m) to forest (e.g., Eucalyptus obliqua and Eucalyptus globulus, 50 m). To classify the vegetation and fire severity, a comprehensive set of variables describing structural, textural and spectral characteristics was gathered from the UAS image and UAS LiDAR datasets. A recursive feature elimination process was used to highlight the subsets of variables to be included in random forest classifiers. The classifiers were then used to map vegetation and severity across the study area. The results indicate that UAS LiDAR provided overall accuracy similar to the UAS image and combined (UAS LiDAR and UAS image predictor values) data streams when classifying vegetation (UAS image: 80.6%; UAS LiDAR: 78.9%; combined: 83.1%) and severity in areas of forest (UAS image: 76.6%; UAS LiDAR: 74.5%; combined: 78.5%) and areas of sedgeland (UAS image: 72.4%; UAS LiDAR: 75.2%; combined: 76.6%). These results indicate that UAS structure-from-motion (SfM) and LiDAR point clouds can be used to assess fire severity at very high spatial resolution.
Highlights
The correlation removal and Recursive Feature Elimination (RFE) approach resulted in eight predictor variables being used in the image-only stream, five predictor variables in the Light Detection and Ranging (LiDAR)-only stream and ten predictor variables in the combined stream.
UAS LiDAR-derived variables were compared against the image-only and combined (UAS LiDAR and unoccupied aircraft systems (UAS) image predictor values) data streams.
Although the feature selection process and subsequent accuracy analysis highlighted the similar capacity of each technology to classify fire severity, large differences in information content indicate that the metrics derived to describe structural change in this study area were not suitable for representing the consumption of fine fuel.
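The feature-selection step described above can be sketched in code. The following is a minimal illustration, assuming a scikit-learn-style workflow: a correlation filter drops one member of each highly correlated pair of predictors, then Recursive Feature Elimination (RFE) wrapped around a random forest retains a fixed number of variables. The synthetic predictor table, the 0.9 correlation threshold and the target of eight variables are illustrative assumptions, not the authors' actual data or settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

# Stand-in for the UAS-derived predictor table (rows = samples,
# columns = structural/textural/spectral metrics). Purely synthetic.
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=8, random_state=0)

# Step 1: correlation removal. Greedily keep a feature only if its
# absolute correlation with every already-kept feature is <= 0.9.
corr = np.abs(np.corrcoef(X, rowvar=False))
keep = []
for j in range(X.shape[1]):
    if all(corr[j, k] <= 0.9 for k in keep):
        keep.append(j)
X_reduced = X[:, keep]

# Step 2: RFE with a random forest estimator, recursively discarding
# the weakest predictors until eight remain (assumed target count).
rfe = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
          n_features_to_select=8)
rfe.fit(X_reduced, y)
selected = np.array(keep)[rfe.support_]
print("Selected predictor indices:", selected)
```

In the paper's workflow this selection was run per data stream (image-only, LiDAR-only, combined), yielding the eight, five and ten predictor variables reported in the highlights.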
Summary
Many of the world’s ecosystems have co-evolved with specific regimes of fire [1,2,3,4], which include the frequency, extent, season, intensity and subsequent severity of fire. Fire severity is a critical element of the fire regime because it can predicate the ecosystem response [5]. Fire severity was quantitatively defined by Keeley [6] as the change in vegetative biomass following fire. Measures of severity are informed by change indicators such as crown volume scorch, percentage fuel consumption and tree mortality [7,8,9,10,11].