Abstract
Algorithms based on convolutional neural networks are the most effective for semantic segmentation of images, including segmentation of forest cover disturbances from satellite imagery. In this study, we assess the applicability of several modifications of the U-Net convolutional neural network architecture for recognizing logged, burnt, and windthrow areas in forests from multi-temporal and multi-seasonal Sentinel-2 satellite images. The assessment was carried out on three test sites that differ substantially in forest stand characteristics and forest management. The highest accuracy (average F-measure of 0.59) was obtained with the baseline U-Net model, whereas the models that performed best during training (Attention U-Net and MobileNetV2 U-Net) did not improve segmentation of independent data. The resulting accuracy estimates are close to those previously published for forests with a substantial proportion of selectively logged areas. The characteristics of logged areas and windthrows, namely their area and type, are the main factors determining semantic segmentation accuracy. Substantial differences were also revealed between images taken in different seasons, with the highest segmentation accuracy achieved on winter image pairs; with summer and cross-season image pairs, the area of forest disturbances is substantially underestimated. Forest species composition has a weaker effect, although for two of the three test sites the maximum accuracy was observed in dark coniferous forests and the minimum in deciduous forests. Slope illumination, calculated from a digital elevation model, had no statistically significant effect on segmentation accuracy for winter image pairs.
The accuracy of burnt-area segmentation, assessed on 14 large forest fires from 2021–2022, is unsatisfactory, probably owing to the varying degrees of forest cover damage within the burnt areas.
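The F-measure reported above is the harmonic mean of precision and recall computed over predicted and reference disturbance masks. A minimal sketch of this metric for flat binary masks (the function name and toy masks are illustrative, not taken from the study):

```python
# Hedged sketch: per-class F-measure (F1) for binary segmentation masks,
# the accuracy metric referenced in the abstract. Masks are flat 0/1 lists
# (1 = disturbance pixel, 0 = background).

def f_measure(pred, truth):
    """Return F1 = 2*P*R / (P + R) over paired 0/1 masks."""
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    if tp == 0:
        return 0.0  # no true positives: precision or recall is zero
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy 4-pixel example: tp=1, fp=1, fn=1 -> precision = recall = 0.5 -> F1 = 0.5
pred = [1, 1, 0, 0]
truth = [1, 0, 1, 0]
print(f_measure(pred, truth))  # -> 0.5
```

In practice the same quantity is computed per disturbance class over entire image pairs and then averaged, which is how a study-wide figure such as 0.59 is obtained.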