Abstract
Remote sensing technology plays an important role in woodland identification, but in mountainous areas with complex terrain, accurately extracting woodland boundary information remains challenging. To address this problem, this paper proposes a multiple mixed attention U-Net (MMA-U-Net) semantic segmentation model, using 2015 and 2022 GF-1 PMS images as data sources, to improve extraction of the boundary features of Picea schrenkiana var. tianschanica forest. The model uses the U-Net architecture as its backbone, adds the hybrid attention module CBAM, and replaces the original skip connections with the DCA module, improving feature extraction for Picea schrenkiana var. tianschanica and thus segmentation accuracy. On a remote sensing dataset built from GF-1 PMS images, the accuracy of the MMA-U-Net model is 5.42%–19.84% higher than that of the original U-Net and other comparison models. Statistical analysis of the spatial distribution of Picea schrenkiana var. tianschanica and its change shows that the forest covered 3471.38 km² in 2015 and 3726.10 km² in 2022. Combining the predicted results with DEM data shows that Picea schrenkiana var. tianschanica is most widely distributed at altitudes of 1700–2500 m. The proposed method can accurately identify Picea schrenkiana var. tianschanica and provides a theoretical basis and a research direction for forest monitoring.
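The CBAM module mentioned above applies channel attention followed by spatial attention to a feature map. As a rough illustration only (the paper's actual implementation is not given here), the sketch below shows the two attention steps in NumPy for a single image; the weight shapes, the reduction ratio, and the omission of the 7×7 convolution in the spatial branch are simplifying assumptions, not the authors' code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """Channel attention: pooled descriptors pass through a shared two-layer MLP.

    x: feature map of shape (C, H, W)
    w1: (C // r, C), w2: (C, C // r) -- hypothetical shared-MLP weights,
    where r is the reduction ratio.
    """
    avg = x.mean(axis=(1, 2))  # global average pooling -> (C,)
    mx = x.max(axis=(1, 2))    # global max pooling -> (C,)
    # Shared MLP with ReLU, applied to both descriptors, then summed.
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return x * att[:, None, None]  # rescale each channel

def spatial_attention(x):
    """Spatial attention from channel-wise average and max maps.

    Simplification: real CBAM concatenates the two maps and applies a
    7x7 convolution; here they are just summed before the sigmoid.
    """
    avg = x.mean(axis=0)  # (H, W)
    mx = x.max(axis=0)    # (H, W)
    att = sigmoid(avg + mx)
    return x * att[None, :, :]  # rescale each spatial location

def cbam(x, w1, w2):
    """Apply channel attention, then spatial attention (CBAM ordering)."""
    return spatial_attention(channel_attention(x, w1, w2))
```

Both attention steps only rescale the input, so the output shape always matches the input feature map, which is what lets CBAM be dropped into U-Net blocks without changing the surrounding architecture.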