Abstract

Crop type identification is the initial stage of, and an important part of, an agricultural monitoring system. Synthetic aperture radar (SAR) Sentinel-1A imagery is well established as a reliable data source for crop type identification. However, a single-temporal SAR image does not contain enough features, and the unique physical characteristics of radar images remain underexploited, which limits its potential in crop mapping. In addition, current methods may not be applicable to time-series SAR data. To address these issues, a new crop type identification method was proposed. Specifically, a farmland mask was first generated with the object Markov random field (OMRF) model to remove the interference of non-farmland factors. Then, the standard backscatter coefficient, sigma-naught (σ0), and the backscatter coefficient normalized by the incident angle, gamma-naught (γ0), were extracted for each crop type, and the optimal feature combination was selected from the time-series SAR images by means of Jeffries-Matusita (J-M) distance analysis. Finally, to make efficient use of the optimal multi-temporal feature combination, a new network, the convolutional-autoencoder neural network (C-AENN), was developed for the crop type identification task. To demonstrate the effectiveness of the method, several classical machine learning methods, such as support vector machine (SVM) and random forest (RF), and deep learning methods, such as the one-dimensional convolutional neural network (1D-CNN) and the stacked auto-encoder (SAE), were used for comparison. In terms of quantitative assessment, the proposed method achieved the highest accuracy, with a macro-F1 score of 0.9825, an overall accuracy (OA) of 0.9794, and a Kappa coefficient (Kappa) of 0.9705. In terms of qualitative assessment, four typical regions were chosen for intuitive comparison with the sample maps, and the identification result covering the study area was compared with a contemporaneous optical image; both indicated the high accuracy of the proposed method. In short, this study enables the effective identification of crop types, demonstrating the importance of multi-temporal radar images in feature combination and the necessity of deep learning networks for extracting complex features.
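The J-M distance analysis mentioned above measures how separable two crop classes are on a given backscatter feature. As a minimal sketch (not the authors' implementation), the distance can be computed for a single feature under a univariate Gaussian assumption: the Bhattacharyya distance B between the two class distributions is mapped to J-M = 2(1 − e^(−B)), which is bounded in [0, 2], with values near 2 indicating high separability. The function name and the sample backscatter values below are illustrative.

```python
import numpy as np

def jeffries_matusita(x1, x2):
    """J-M distance between two 1-D feature samples, assuming each class
    is Gaussian-distributed on this feature. Returns a value in [0, 2]."""
    m1, m2 = np.mean(x1), np.mean(x2)
    v1, v2 = np.var(x1), np.var(x2)
    v = 0.5 * (v1 + v2)  # pooled variance
    # Bhattacharyya distance for two univariate Gaussians
    b = (m1 - m2) ** 2 / (8.0 * v) + 0.5 * np.log(v / np.sqrt(v1 * v2))
    return 2.0 * (1.0 - np.exp(-b))

rng = np.random.default_rng(0)
crop_a = rng.normal(-12.0, 1.0, 1000)  # illustrative sigma-naught (dB) samples
crop_b = rng.normal(-6.0, 1.0, 1000)

print(jeffries_matusita(crop_a, crop_a))  # 0.0: identical classes
print(jeffries_matusita(crop_a, crop_b))  # close to 2: highly separable
```

Evaluating this score for each date and feature (σ0, γ0) pair, and keeping the combinations with the highest pairwise separability, is one plausible way the optimal multi-temporal feature combination described in the abstract could be assembled.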

Highlights

  • With the intensive development of agricultural production modes in China, smart agriculture has emerged [1]

  • This study aims to develop a deep learning method to improve the accuracy of crop type identification using Sentinel-1A synthetic aperture radar (SAR) images

  • This paper demonstrated that the convolutional-autoencoder neural network (C-AENN) classifier outperformed classical machine learning methods, such as support vector machine (SVM) and random forest (RF), as well as neural network baselines
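The classifier comparison above is quantified with overall accuracy (OA), the Kappa coefficient, and macro-F1. As a self-contained sketch of how these three scores are derived from a confusion matrix (the function name and toy labels are illustrative, not from the paper):

```python
import numpy as np

def evaluation_scores(y_true, y_pred, n_classes):
    """Overall accuracy, Kappa coefficient, and macro-F1 from label arrays."""
    cm = np.zeros((n_classes, n_classes), dtype=float)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1  # rows: reference labels, columns: predictions
    n = cm.sum()
    oa = np.trace(cm) / n
    # Kappa: observed agreement corrected for chance agreement p_e
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    kappa = (oa - pe) / (1.0 - pe)
    # Macro-F1: unweighted mean of per-class F1 scores
    tp = np.diag(cm)
    precision = tp / np.maximum(cm.sum(axis=0), 1e-12)
    recall = tp / np.maximum(cm.sum(axis=1), 1e-12)
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return oa, kappa, f1.mean()

y_true = np.array([0, 0, 1, 1, 2, 2])  # toy reference labels
y_pred = np.array([0, 0, 1, 2, 2, 2])  # toy classifier output
oa, kappa, mf1 = evaluation_scores(y_true, y_pred, 3)
print(round(oa, 3))    # 0.833
print(round(kappa, 2)) # 0.75
```

Macro-F1 averages per-class F1 without class-size weighting, which is why it is a useful companion to OA when crop classes are imbalanced.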


Introduction

With the intensive development of agricultural production modes in China, smart agriculture has emerged [1]. There is an urgent need for large-scale and efficient monitoring of crops [2]. Because remote sensing technology offers objectivity and economy, its application in the agricultural field is continuously expanding and deepening [3,4]. Agricultural remote sensing applications include crop type identification [5], yield estimation [6], soil moisture inversion [7], and growth and phenological phase monitoring [8,9]. Crop type identification is prerequisite work for national government departments to grasp the status of crop production, which is of great significance to agricultural management. Crop type identification by remote sensing is usually carried out by visual interpretation [10] or by computer techniques (supervised and unsupervised classification methods [11]).


