Abstract

Accurate and timely crop mapping is essential for agricultural applications, and deep-learning methods have been applied to a range of remotely sensed data sources to classify crops. In this article, we develop a novel crop classification method based on spatiotemporal deep-learning fusion. For crop mapping, however, the selection and labeling of training samples is expensive and time-consuming, so we propose a fully automated training-sample-selection method. First, we design the method around image-processing algorithms and a sliding-window scheme. Second, we develop the Geo-3D convolutional neural network (CNN) and Geo-Conv1D for crop classification using time-series Sentinel-2 imagery; specifically, we integrate the geographic information of crops into the structure of the deep-learning networks. Finally, we apply an active-learning strategy to combine the classification strengths of the Geo-3D CNN and Geo-Conv1D. Experiments conducted in Northeast China show that the proposed sampling method can reliably provide and label a large number of samples and achieves satisfactory results across different deep-learning networks. Built on the automatically selected and labeled training samples, the spatiotemporal deep-learning fusion method achieves the highest overall accuracy (OA), approximately 92.50%, compared with Geo-Conv1D (91.89%) and Geo-3D CNN (91.27%) across the three study areas, indicating that the proposed method is effective and efficient for multi-temporal crop classification.
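The sliding-window sample-selection idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: the window size, vegetation-index (VI) thresholds, and the homogeneity rule are all assumptions standing in for the image-processing rules the article develops.

```python
import numpy as np

def select_samples(vi_image, window=3, low=0.6, high=1.0):
    """Slide a window over a vegetation-index image and keep the
    centers of windows whose pixels all fall inside a class-specific
    VI range, treating such homogeneous patches as reliable, automatically
    labeled training samples. Thresholds and window size are illustrative."""
    h, w = vi_image.shape
    r = window // 2
    samples = []
    for i in range(r, h - r):
        for j in range(r, w - r):
            patch = vi_image[i - r:i + r + 1, j - r:j + r + 1]
            # a homogeneous patch entirely inside [low, high] -> keep center
            if patch.min() >= low and patch.max() <= high:
                samples.append((i, j))
    return samples
```

A pixel is only kept when its whole neighborhood agrees, which trades sample quantity for label purity; the article's method additionally labels the samples per crop class.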

Highlights

  • Accurately classifying crop types is important for both scientific and practical purposes, including estimating crop yields, strengthening crop production management, and supporting crop insurance [1]

  • We evaluated the crop classification accuracy of various methods on the test dataset using six metrics: overall accuracy (OA), Kappa coefficient, precision, recall, F1 score, and intersection over union (IoU) [50]

  • The range of variation of the VARIgreen spectral response of rice in July was significantly higher than that of other crops and urban areas; this distinct spectral signature made rice samples easier to extract through image processing
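
The six metrics listed above can all be derived from a single confusion matrix. The helper below is a hypothetical sketch (the paper's evaluation code is not given), but the formulas for OA, Kappa, precision, recall, F1, and IoU are standard.

```python
import numpy as np

def classification_metrics(y_true, y_pred, n_classes):
    """Compute OA, Kappa, and per-class precision/recall/F1/IoU
    from the confusion matrix of integer class labels."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    total = cm.sum()
    oa = np.trace(cm) / total                      # overall accuracy
    # Kappa: agreement corrected for chance agreement p_e
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
    kappa = (oa - pe) / (1 - pe)
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)  # TP / (TP + FP)
    recall = tp / np.maximum(cm.sum(axis=1), 1)     # TP / (TP + FN)
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    iou = tp / np.maximum(cm.sum(axis=0) + cm.sum(axis=1) - tp, 1)
    return oa, kappa, precision, recall, f1, iou
```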


Introduction

Accurately classifying crop types is important for both scientific and practical purposes, including estimating crop yields, strengthening crop production management, and supporting crop insurance [1]. Remote-sensing approaches to crop classification generally follow one of two strategies. One is to use only spectral features, that is, a scene collected by a single satellite on a certain day during the crop growth period [4]. The other is to use the spectral and time-series information of crops over one or more growth periods, which requires collecting multiple scenes of satellite imagery [5]. The first strategy exploits the unique spectral characteristics of different crops. The second makes full use of the spectral and temporal information across the crop growth period, extracts features from the time-series data, and obtains useful information about the crop growth stages to improve classification accuracy. For example, Wardlow and Egbert [8] used time-series Moderate-Resolution Imaging Spectroradiometer (MODIS) normalized difference vegetation index (NDVI) data to classify Kansas crops with a decision-tree classifier, achieving an overall accuracy (OA) of 84% to 94%. Temporal characteristics acquired in this way improve crop classification accuracy to a certain extent.
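The NDVI time-series strategy can be illustrated with a small sketch. The NDVI formula is standard; the decision-tree rules and thresholds below are purely illustrative and are not those of Wardlow and Egbert [8].

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), guarded against division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / np.maximum(nir + red, 1e-12)

def simple_tree_classify(ndvi_series):
    """Toy decision-tree rules on one pixel's NDVI time series:
    peak greenness separates vegetation from bare/built surfaces,
    and seasonal amplitude separates crops from stable vegetation.
    Thresholds (0.3, 0.2) are assumptions for illustration only."""
    series = np.asarray(ndvi_series, dtype=float)
    peak = float(series.max())
    amplitude = peak - float(series.min())
    if peak < 0.3:
        return "non-vegetation"
    if amplitude < 0.2:
        return "evergreen/other"
    return "crop"
```

A real decision-tree classifier would learn such split thresholds from labeled time-series features rather than hard-coding them.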


