Abstract

Public release of the entire Landsat data archive and the rise of cloud-based geocomputation platforms have greatly facilitated land-cover mapping worldwide, yet acquiring training samples remains a significant challenge. In this study, we developed a transfer-learning approach that makes creative use of existing global land-cover products and previously shared training samples to address this challenge. We pretrained a deep neural network (DNN) model using low-quality samples generated automatically from existing global land-cover products, and then fine-tuned the pretrained model using high-quality samples gathered from those shared by the authors of earlier studies. We applied this approach to produce a land-cover map of Beijing, China, for 2015. Based on independent validation samples, the fine-tuned DNN model achieved an overall accuracy (OA) of 86.4% and a kappa coefficient of 0.796, matching and in some cases surpassing the accuracy of existing land-cover maps of Beijing. With the same training samples, the fine-tuned DNN outperformed random forest (RF) (OA = 70.2%, kappa = 0.588) and support vector machine (SVM) models (OA = 68.7%, kappa = 0.555), indicating that the training-sample requirement of land-cover classification can be met by combining deep learning models, existing global land-cover products, and recycled training samples.

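The abstract describes a two-stage workflow: pretrain a DNN on abundant but noisy labels taken from existing global land-cover products, then fine-tune it on a smaller set of high-quality, reused samples. The sketch below illustrates that workflow only in outline; the network architecture, band count, class set, sample arrays, and hyperparameters are all assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of the pretrain-then-fine-tune workflow described in the
# abstract. All names (band count, class count, sample arrays, layer sizes)
# are hypothetical placeholders, not the paper's actual configuration.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_BANDS = 6      # assumed: Landsat surface-reflectance bands per pixel
N_CLASSES = 8    # assumed: number of land-cover classes

def build_dnn():
    # A generic fully connected classifier; the paper's exact DNN layout may differ.
    return models.Sequential([
        layers.Input(shape=(N_BANDS,)),
        layers.Dense(128, activation="relu"),
        layers.Dense(64, activation="relu"),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

# Stage 1: pretrain on low-quality samples drawn from existing global products.
X_noisy = np.random.rand(10000, N_BANDS)          # placeholder spectra
y_noisy = np.random.randint(0, N_CLASSES, 10000)  # placeholder noisy labels

model = build_dnn()
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X_noisy, y_noisy, epochs=10, batch_size=256, verbose=0)

# Stage 2: fine-tune on high-quality samples reused from earlier studies.
X_clean = np.random.rand(1000, N_BANDS)
y_clean = np.random.randint(0, N_CLASSES, 1000)

# A smaller learning rate helps preserve what was learned during pretraining.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X_clean, y_clean, epochs=20, batch_size=64, verbose=0)
```

The key design choice mirrored here is that the noisy-label stage supplies broad coverage cheaply, while the clean-label stage, run with a reduced learning rate, corrects the label noise without discarding the pretrained weights.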