Abstract
In recent decades, high-resolution (HR) remote sensing images have shown considerable potential for providing detailed information for change detection. Traditional change detection methods based on HR remote sensing images typically detect either a single land-use type or only the extent of change, and cannot simultaneously identify changes across all object types together with pixel-level changes in extent. To overcome this difficulty, we propose a new coarse-to-fine, deep learning-based land-use change detection method. We independently created a new scene classification dataset called NS-55 and matched convolutional neural network (CNN) architectures to scene complexity by selecting the CNN that best fits the complexity of each scene. The CNN trained on NS-55 classifies each scene, the final scene category is determined by majority voting, and changed scenes are obtained by comparing the classifications of the two dates, yielding the coarse change result. We then developed a multi-scale threshold (MST) method, a new approach for obtaining high-quality training samples. The samples selected by MST are used to train a deep belief network (DBN), which produces pixel-level change-extent detection results. By mapping the coarse scene changes onto these extent changes, we obtain fine, multi-type land-use change detection results. Experiments were conducted on the Multi-temporal Scene Wuhan dataset and on aerial images of an area of Dapeng New District, Shenzhen, where the proposed method achieved promising results. These results demonstrate that the proposed method is practical and easy to implement, and that the NS-55 dataset is physically justified. The proposed method has potential for large-scale, fine-grained land-use change detection and for qualitative and quantitative research on land use/cover change based on HR remote sensing data.
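The coarse stage described above can be illustrated with a minimal sketch: each scene block receives several CNN predictions, the final label is taken by majority vote, and a block is flagged as changed when the voted labels differ between the two dates. This is one possible reading of the majority-voting step; the function names, the dictionary layout, and the example labels below are illustrative and not taken from the paper's code.

```python
from collections import Counter

def majority_vote(labels):
    """Return the label predicted most often; ties are resolved by first occurrence."""
    return Counter(labels).most_common(1)[0][0]

def coarse_scene_changes(preds_t1, preds_t2):
    """
    preds_t1 / preds_t2: dict mapping scene-block id -> list of CNN predictions
    at the two acquisition dates. A block is reported as changed when the
    majority-vote labels differ between the dates.
    """
    changed = {}
    for block_id in preds_t1:
        label_t1 = majority_vote(preds_t1[block_id])
        label_t2 = majority_vote(preds_t2[block_id])
        if label_t1 != label_t2:
            changed[block_id] = (label_t1, label_t2)
    return changed

# Hypothetical usage with three CNN votes per scene block:
t1 = {"block_07": ["residential", "residential", "industrial"]}
t2 = {"block_07": ["bare_land", "bare_land", "residential"]}
print(coarse_scene_changes(t1, t2))  # {'block_07': ('residential', 'bare_land')}
```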
Highlights
Change detection is the process of comparing objects, scenes, or phenomena in different time dimensions to identify their state differences [1,2]
To obtain fine change detection results and extract information on the various types of changed objects and their locations in the study area, we propose a land-use change detection method for HR remote sensing images based on coarse-to-fine deep learning
To accelerate model convergence and improve training accuracy, and drawing on the idea of transfer learning [64,65,66], the AlexNet, ResNet50, ResNet152, and DenseNet169 models used in this paper were pre-trained, that is, networks pre-trained on the 1000-class ImageNet dataset (see the fine-tuning sketch after this list)
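The transfer-learning setup mentioned above can be sketched as follows: load an ImageNet-pretrained backbone and replace its classifier head so it predicts the scene categories of NS-55. This is a minimal sketch, assuming a PyTorch/torchvision implementation and assuming NS-55 contains 55 scene categories (as the name suggests); the paper does not specify the framework, the layer-freezing strategy, or the hyperparameters used here.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a ResNet50 pre-trained on the 1000-class ImageNet dataset
# (one of the four backbones listed above; PyTorch is an assumption here).
model = models.resnet50(pretrained=True)

# Replace the final fully connected layer so the network predicts the
# scene categories of the NS-55 dataset (55 classes assumed).
num_scene_classes = 55
model.fc = nn.Linear(model.fc.in_features, num_scene_classes)

# Optionally freeze the early convolutional layers and fine-tune only the
# deepest residual block plus the new classifier head.
for name, param in model.named_parameters():
    if not (name.startswith("layer4") or name.startswith("fc")):
        param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
criterion = nn.CrossEntropyLoss()
```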
Summary
Change detection is the process of comparing objects, scenes, or phenomena in different time dimensions to identify their state differences [1,2]. To obtain fine change detection results and extract information on the various types of changed objects and their locations in the study area, we propose a land-use change detection method for HR remote sensing images based on coarse-to-fine deep learning. Within each changed scene, we use a multi-scale threshold (MST) method to obtain high-quality changed and unchanged training samples by thresholding the spectral and texture change-intensity values. Using these changed and unchanged samples to train a deep belief network (DBN), the DBN can then detect changes in the image and produce pixel-level change-extent results. (1) A new coarse-to-fine, deep learning-based approach to detecting land use/cover change is proposed, which overcomes the limitation of traditional methods that detect only a single object type or only the extent of change. (2) We provide a new method for selecting training samples for deep learning-based remote sensing image processing, which improves the efficiency of sample labelling and reduces the tedious work of manual labelling.
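The sample-selection idea behind MST can be illustrated with a simplified, single-scale sketch: within a changed scene, pixels whose spectral and texture change intensities are both low are taken as confident unchanged samples, while pixels where both are high are taken as confident changed samples, and these samples then train the DBN. The quantile thresholds, the single-scale simplification, and the function name below are illustrative assumptions, not the paper's actual multi-scale thresholding procedure.

```python
import numpy as np

def select_training_samples(spectral_diff, texture_diff, low_q=0.2, high_q=0.8):
    """
    Simplified, single-scale illustration of MST-style sample selection.
    spectral_diff, texture_diff: 2-D arrays of change-intensity values for a
    changed scene block. Pixels below a low quantile in both measures are
    treated as confident 'unchanged' samples; pixels above a high quantile in
    both are treated as confident 'changed' samples; the rest stay unlabelled.
    """
    s_lo, s_hi = np.quantile(spectral_diff, [low_q, high_q])
    t_lo, t_hi = np.quantile(texture_diff, [low_q, high_q])

    unchanged_mask = (spectral_diff <= s_lo) & (texture_diff <= t_lo)
    changed_mask = (spectral_diff >= s_hi) & (texture_diff >= t_hi)
    return changed_mask, unchanged_mask

# Hypothetical usage on random change-intensity maps:
rng = np.random.default_rng(0)
spec = rng.random((64, 64))
tex = rng.random((64, 64))
changed, unchanged = select_training_samples(spec, tex)
print(changed.sum(), unchanged.sum())
```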