Abstract

The risk and damage of wildfires have been increasing for various reasons, including climate change, and the Republic of Korea is no exception. Burned area mapping is crucial not only for preventing further damage but also for managing burned areas. Burned area mapping with satellite data, however, has been limited by the spatial and temporal resolution of the data and by classification accuracy. This article presents a new burned area mapping method in which damaged areas are delineated by semantic segmentation. The method uses PlanetScope imagery, which offers high spatial resolution and a very short revisit time, and is based on U-Net, requiring only a single unitemporal PlanetScope image. The network was trained on 17 satellite images covering 12 forest fires, together with corresponding label images generated semiautomatically by thresholding. Band combination tests were conducted to identify the optimal burned area mapping model; the results demonstrated that the optimal and most stable combination is the red, green, blue, and near-infrared bands of PlanetScope. To improve classification accuracy, the Normalized Difference Vegetation Index (NDVI), dissimilarity extracted from the Gray-Level Co-Occurrence Matrix (GLCM), and land cover maps were used as additional input datasets. In addition, topographic normalization was applied to reduce shadow effects, improving model performance and classification accuracy. The F1 scores and overall accuracies of the final segmentation models range from 0.883 to 0.939 and from 0.990 to 0.997, respectively. These results highlight the potential of deep learning-based approaches for detecting burned areas.
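
As a rough illustration of the input preparation the abstract describes, the sketch below applies a topographic correction to a four-band scene, computes NDVI, and stacks the results as input channels for a segmentation network. The abstract does not specify the exact normalization method, band scaling, or variable names, so the cosine-correction formula, the synthetic inputs, and all identifiers here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Hypothetical stand-ins for a 4-band PlanetScope scene (B, G, R, NIR) and
# DEM-derived slope/aspect; real inputs would come from the imagery and a DEM.
rng = np.random.default_rng(0)
H, W = 256, 256
scene = rng.uniform(0.0, 0.4, size=(4, H, W)).astype(np.float32)  # reflectance
slope = rng.uniform(0.0, np.deg2rad(40.0), size=(H, W)).astype(np.float32)
aspect = rng.uniform(0.0, 2.0 * np.pi, size=(H, W)).astype(np.float32)
solar_zenith, solar_azimuth = np.deg2rad(35.0), np.deg2rad(150.0)

def cosine_correction(band, slope, aspect, sz, sa):
    """Cosine topographic correction: one common normalization; the paper's
    exact method is not stated in the abstract."""
    # Cosine of the local solar incidence angle on a tilted surface.
    cos_i = (np.cos(sz) * np.cos(slope)
             + np.sin(sz) * np.sin(slope) * np.cos(sa - aspect))
    cos_i = np.clip(cos_i, 1e-3, None)  # avoid division blow-up in deep shadow
    return band * np.cos(sz) / cos_i

def ndvi(red, nir, eps=1e-6):
    """NDVI = (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + eps)

# Normalize each band, then stack the bands and NDVI as input channels.
corrected = np.stack([cosine_correction(b, slope, aspect,
                                        solar_zenith, solar_azimuth)
                      for b in scene])
features = np.concatenate([corrected,
                           ndvi(corrected[2], corrected[3])[None]], axis=0)
print(features.shape)  # (5, 256, 256); GLCM dissimilarity and a land cover
                       # layer would be appended as additional channels.
```

Stacking the auxiliary layers as extra channels, as sketched above, lets a standard multi-channel U-Net consume them without architectural changes, which matches the band-combination testing the abstract reports.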
