Abstract

In this study, we introduce an improved semi-supervised deep learning approach and demonstrate its suitability for modeling the relationship between forest structural parameters and satellite remote sensing imagery and for producing forest maps. The approach is based on the popular UNet model, modified and fine-tuned to improve forest parameter prediction performance. Within the improved model, squeeze-and-excitation blocks are embedded to re-calibrate the multi-source features via channel-wise self-attention, and a novel Cross-Pseudo Regression strategy is implemented to train the model in a semi-supervised way. The strategy imposes consistency learning on two perturbed network branches, generating regression pseudo-references and effectively expanding the training dataset. For demonstration, we used satellite synthetic aperture radar (SAR) Sentinel-1 and multispectral optical Sentinel-2 images as remote sensing data, complemented with reference data represented by forest tree height, one of the key forest structural variables. The study area is located in boreal forestland in Central Finland. The proposed approach showed higher accuracy than traditional machine learning methods such as random forests and boosted trees, as well as the baseline UNet model. The best accuracy figures for forest tree height were achieved with combined SAR and optical imagery and were as low as 24.1% RMSE (root mean square error) at pixel level and 15.4% RMSE at forest stand level.
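The two key ingredients named above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the bottleneck ratio, and the use of mean-squared error as the consistency term are assumptions for illustration. The first function shows squeeze-and-excitation-style channel re-calibration (global average pooling, a bottleneck MLP, a sigmoid gate); the second shows a cross-pseudo consistency loss in which each perturbed branch's detached prediction serves as the pseudo-reference for the other.

```python
import numpy as np

def squeeze_excite(features, w1, w2):
    """Channel-wise re-calibration in the style of squeeze-and-excitation.

    features: (C, H, W) feature map; w1: (C//r, C) and w2: (C, C//r)
    are the bottleneck MLP weights (r is an assumed reduction ratio).
    """
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    z = features.mean(axis=(1, 2))
    # Excitation: ReLU bottleneck followed by a sigmoid gate -> (C,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ np.maximum(w1 @ z, 0.0))))
    # Re-scale each channel by its learned attention weight
    return features * s[:, None, None]

def cross_pseudo_regression_loss(pred_a, pred_b):
    """Consistency term on unlabeled pixels for two perturbed branches:
    each branch's prediction, treated as a constant (stop-gradient),
    is the regression pseudo-reference for the other branch."""
    target_a = pred_a.copy()  # copy() stands in for detach/stop-gradient
    target_b = pred_b.copy()
    return 0.5 * (np.mean((pred_a - target_b) ** 2)
                  + np.mean((pred_b - target_a) ** 2))
```

In a full training loop, this consistency term would be added to the supervised regression loss on the labeled pixels, which is how the unlabeled imagery expands the effective training set.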
