Abstract. Rapid and accurate assessment of bushfire impact is critical because bushfires play a significant role in forest degradation and threaten ecosystems and human lives. Over the past decades, several supervised algorithms for burn severity mapping have been proposed, but they share the significant drawback of time-consuming labeling. Moreover, no robust framework exists for burn severity mapping that fuses multi-sensor, multi-resolution, and multi-temporal remote sensing imagery from satellite and aerial platforms. This paper therefore presents an unsupervised two-step pipeline for burn severity mapping that processes 2D data followed by 3D data, both acquired from either aircraft or satellites. For the 2D processing, our proposed unsupervised burned area detection (UsBA detection) model improves burned area mapping accuracy by integrating Ultra-High Resolution (UHR) aerial imagery with bi-temporal medium-resolution PlanetScope imagery, using a Segment Anything Model (SAM)-assisted UNetFormer (pre-trained on the target-style public dataset LoveDA Rural) for refinement. The model delivers superior burned area segmentation, as evidenced by improved evaluation metrics computed on labeled test sites. For the 3D analysis, the burned areas extracted in the 2D step are further assessed using pre- and post-event airborne laser scanning data. We implement a voxel-based workflow that includes ground filtering through the Superpoints in RANSAC Planes (SiRP) method and biomass change analysis. The results indicate that the 3D branch provides a reliable lower bound on the actual damage map, because vegetation growth between the two acquisitions remains essentially undetected. The proposed framework offers an accurate and robust solution for burn severity mapping from combined 2D and 3D data, evaluated on a multi-source dataset from a real bushfire event in Bushland Park, South Australia.
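The lower-bound property of the 3D branch can be illustrated with a minimal occupancy-grid sketch (an assumed simplification for illustration; the function names and voxel size are hypothetical, not the paper's implementation). Voxels occupied in the pre-event point cloud but empty in the post-event cloud register as biomass loss, while new growth only adds voxels to the post-event cloud and is therefore not counted, so the detected loss underestimates the true damage:

```python
import numpy as np

def voxelize(points, voxel_size=1.0):
    """Map an (N, 3) array of points to the set of occupied voxel indices."""
    idx = np.floor(np.asarray(points) / voxel_size).astype(int)
    return set(map(tuple, idx))

def biomass_loss_voxels(pre_points, post_points, voxel_size=1.0):
    """Voxels occupied before the fire but empty after it.

    New vegetation growth only adds post-event voxels, so this set
    difference is a lower bound on the actual damage.
    """
    pre = voxelize(pre_points, voxel_size)
    post = voxelize(post_points, voxel_size)
    return pre - post

# Toy example: a vegetation column burned down to a ground-level return.
pre = np.array([[0.5, 0.5, z] for z in (0.5, 1.5, 2.5)])
post = np.array([[0.5, 0.5, 0.5]])
lost = biomass_loss_voxels(pre, post)
print(len(lost))  # 2 voxels of vegetation lost
```

In practice this step would follow ground filtering, so that only above-ground returns contribute to the occupancy comparison.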