Abstract

To evaluate the effects of forest fires on the structure and function of ecosystems, burned forest areas must be mapped from satellite images accurately, efficiently, economically, and practically. Extracting burned forest areas from high-resolution satellite images with image classification algorithms, and assessing how well different classification algorithms perform, has become a prominent research field. This study demonstrates the capability of the deep learning-based Stacked Autoencoders method for mapping burned forest areas from Sentinel-2 satellite images. Stacked Autoencoders, used here as an unsupervised learning method, were compared qualitatively and quantitatively with frequently used supervised learning algorithms (k-Nearest Neighbors (k-NN), Subspace k-NN, Support Vector Machines, Random Forest, Bagged Decision Tree, Naive Bayes, and Linear Discriminant Analysis) on two distinct burned forest zones. Selecting burned forest zones with contrasting structural characteristics enabled an objective assessment. Burned areas manually digitized from Sentinel-2 satellite images were used for accuracy assessment. The methods were compared with several classification performance and quality metrics: Overall Accuracy, Mean Squared Error, Correlation Coefficient, Structural Similarity Index Measure, Peak Signal-to-Noise Ratio, Universal Image Quality Index, and Kappa. In addition, the consistency of the Stacked Autoencoders results was examined through boxplots. In both the quantitative and qualitative analyses, the Stacked Autoencoders method achieved the highest accuracy values.
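To illustrate the approach summarized above, the following is a minimal sketch, not the authors' implementation, of how a stacked autoencoder could be applied to Sentinel-2 pixel spectra for unsupervised burned-area mapping. The band count, network sizes, training settings, and the final two-cluster separation step are illustrative assumptions rather than details taken from the paper.

    # Minimal sketch: greedy layer-wise stacked autoencoder on Sentinel-2
    # pixel spectra, followed by unsupervised clustering into two classes
    # (burned / unburned). All shapes and hyperparameters are assumptions.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers
    from sklearn.cluster import KMeans

    n_bands = 12  # assumed number of Sentinel-2 bands used as features
    X = np.random.rand(10000, n_bands).astype("float32")  # placeholder spectra

    def train_autoencoder(data, code_dim):
        """Train one autoencoder layer on `data`; return its encoder."""
        inp = keras.Input(shape=(data.shape[1],))
        code = layers.Dense(code_dim, activation="relu")(inp)
        out = layers.Dense(data.shape[1], activation="linear")(code)
        ae = keras.Model(inp, out)
        ae.compile(optimizer="adam", loss="mse")
        ae.fit(data, data, epochs=20, batch_size=256, verbose=0)
        return keras.Model(inp, code)

    # Greedy stacking: each encoder is trained on the previous layer's codes.
    enc1 = train_autoencoder(X, 8)
    h1 = enc1.predict(X, verbose=0)
    enc2 = train_autoencoder(h1, 4)
    h2 = enc2.predict(h1, verbose=0)

    # Separate the learned features into two clusters without labels;
    # which cluster is "burned" would be decided afterwards (e.g., by
    # comparing mean spectra), as no ground-truth labels are used here.
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(h2)

In practice, the clustered map would then be compared against the manually digitized reference polygons using the metrics listed above (Overall Accuracy, Kappa, SSIM, and so on).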
