Abstract

High-resolution, deep-learning-based analysis of remote-sensing imagery has been widely used in land-use and crop-classification mapping. However, the influence of composite feature bands, including complex feature indices derived from different sensors, on the backbone, patch size, and predictions of transferable deep models requires further testing. Experiments were conducted at six sites in Henan Province from 2019 to 2021. This study sought to enable the transfer of classification models across regions and years for Sentinel-2A (10-m resolution) and Gaofen PMS (2-m resolution) imagery. With feature selection and up-sampling of small samples, the performance of the UNet++ architecture with five backbones and four patch sizes was examined. Joint loss, mean Intersection over Union (mIoU), and epoch time were analyzed, and the optimal backbone and patch size for both sensors were Timm-RegNetY-320 and 256 × 256, respectively. The overall accuracy and F1 scores of the Sentinel-2A predictions ranged from 96.86% to 97.72% and 71.29% to 80.75%, respectively, compared to 75.34%–97.72% and 54.89%–73.25% for the Gaofen predictions. The site-level accuracies indicated that patch size exerted a greater influence on model performance than the backbone. The feature-selection-based predictions with the UNet++ architecture and up-sampling of minor classes demonstrated the generalization capability of deep learning for classifying complex ground objects, offering improved performance compared to the UNet, Deeplab V3+, Random Forest, and Object-Oriented Classification models. In addition to the overall accuracy, confusion matrices, precision, recall, and F1 scores should be evaluated for minor land-cover types. This study contributes to large-scale, dynamic, and near-real-time land-use and crop mapping by integrating deep learning and multi-source remote-sensing imagery.
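
To make the reported configuration concrete, the following is a minimal sketch (not the authors' code) of how a UNet++ model with the best-performing backbone (Timm-RegNetY-320) and patch size (256 × 256) could be instantiated using the open-source segmentation_models_pytorch library. The band count, class count, and the Dice + cross-entropy combination used for the joint loss are assumptions for illustration; the abstract does not specify them.

```python
# Minimal sketch, assuming the segmentation_models_pytorch library;
# band/class counts and the joint-loss composition are placeholders,
# not the study's actual configuration.
import torch
import segmentation_models_pytorch as smp

NUM_BANDS = 10    # hypothetical: selected composite feature bands
NUM_CLASSES = 8   # hypothetical: land-use / crop classes

model = smp.UnetPlusPlus(
    encoder_name="timm-regnety_320",  # best-performing backbone in the study
    encoder_weights="imagenet",
    in_channels=NUM_BANDS,
    classes=NUM_CLASSES,
)

# Assumed joint loss: Dice + cross-entropy (the abstract only states "joint loss").
dice_loss = smp.losses.DiceLoss(mode="multiclass")
ce_loss = torch.nn.CrossEntropyLoss()

# Forward pass on a dummy batch at the optimal 256 x 256 patch size.
patch = torch.randn(4, NUM_BANDS, 256, 256)
target = torch.randint(0, NUM_CLASSES, (4, 256, 256))
logits = model(patch)
loss = dice_loss(logits, target) + ce_loss(logits, target)
```

In this sketch, swapping `encoder_name` and the patch dimensions would reproduce the backbone and patch-size comparison described in the abstract, while per-class precision, recall, and F1 for minor land-cover types would be computed from the confusion matrix of the resulting predictions.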
