Abstract

Despite their popularity and success in burned area detection and assessment, multispectral satellite images are often affected by poor sunlight-illumination conditions, particularly at high latitudes. Given that Synthetic Aperture Radar (SAR) can effectively penetrate clouds and collect images in all weather conditions, day and night, the complementary use of optical and SAR data can benefit remote-sensing measurements and assessments of burned sites. Nevertheless, the widely used burn-sensitive spectral indices are of little help in analyzing SAR data because of the inherent difference in physical imaging mechanisms between optical and SAR sensors. In this study, we aim to leverage multi-source data for burned area mapping and burn severity assessment by translating SAR images into optical images with a ResNet-based Pix2Pix model. Experiments were performed on 8669 pairs of bitemporal Sentinel-1 SAR and Sentinel-2 optical patches covering 304 large wildfire events in Canada with a wide range of land covers. The optical images translated from SAR data exhibited spectral properties similar to those of real optical observations, and the spectral indices generated from them (i.e., delta Normalized Burn Ratio (dNBR), relative dNBR (RdNBR), and Relativized Burn Ratio (RBR)) also showed high agreement with the real optical indices. For burned area detection using the generated indices, the median values of the area under the receiver operating characteristic curve (AUC) exceeded 85%, a promising performance that outperformed the SAR-based index. Burn severity maps derived from multi-source data achieved a relatively high Kappa coefficient of 0.77. These results show the feasibility and effectiveness of GAN-based SAR-to-optical translation for wildfire impact assessment, paving the way for the combined use of optical and SAR data.
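
The burn indices named above have standard published definitions (NBR from NIR and SWIR reflectance; RdNBR following Miller and Thode, 2007; RBR following Parks et al., 2014). As a minimal sketch of how they could be computed from bitemporal imagery, assuming NumPy arrays of reflectance values; the function names are illustrative, and the use of Sentinel-2 bands B8A (NIR) and B12 (SWIR) is an assumption, since the abstract does not state the exact band choice:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance arrays."""
    return (nir - swir) / (nir + swir)

def burn_indices(nir_pre, swir_pre, nir_post, swir_post):
    """Compute dNBR, RdNBR, and RBR from pre- and post-fire reflectance.

    For Sentinel-2, NIR could be band B8A and SWIR band B12
    (an assumption; the paper's exact band choice is not given here).
    """
    nbr_pre = nbr(nir_pre, swir_pre)
    nbr_post = nbr(nir_post, swir_post)
    dnbr = nbr_pre - nbr_post                  # delta NBR
    rdnbr = dnbr / np.sqrt(np.abs(nbr_pre))    # relative dNBR (Miller & Thode, 2007)
    rbr = dnbr / (nbr_pre + 1.001)             # Relativized Burn Ratio (Parks et al., 2014)
    return dnbr, rdnbr, rbr
```

In the study's workflow, the same index computations would apply equally to real Sentinel-2 reflectance and to the optical reflectance translated from Sentinel-1 SAR, which is what allows the generated and real indices to be compared directly.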
