Abstract
Polarimetric SAR images are a rich data source for crop mapping. However, quad-pol sensors have some limitations due to their complexity, increased data rate, and reduced coverage and revisit time. The main objective of this study was to evaluate the added value of quad-pol data in a multi-temporal crop classification framework based on SAR imagery. With this aim, three RADARSAT-2 scenes were acquired between May and June 2010. After analyzing the separability of the features and performing a descriptive analysis, an object-based supervised classification was carried out using the Random Forests classification algorithm. Classification results obtained with dual-pol (VV-VH) data as input were compared to those using quad-pol data in different polarization bases (linear H-V, circular, and linear 45°), and also to configurations where several polarimetric features (Pauli and Cloude–Pottier decomposition features and co-pol coherence and phase difference) were added. Dual-pol data yielded satisfactory results, equal to those obtained with quad-pol data (in the H-V basis) in terms of overall accuracy (0.79) and Kappa (0.69). Quad-pol data in the circular and linear 45° bases resulted in lower accuracies. The inclusion of polarimetric features, particularly co-pol coherence and phase difference, enhanced classification accuracy, reaching an overall accuracy of 0.86 and a Kappa of 0.79 in the best case, when all the polarimetric features were added. Improvements were also observed in the identification of some particular crops, but major crops such as cereals, rapeseed, and sunflower already achieved satisfactory accuracy with the VV-VH dual-pol configuration and gained only minor improvements. Therefore, it can be concluded that C-band VV-VH dual-pol data is almost ready for operational crop mapping, provided that at least three acquisitions are available on dates that capture the key growth stages and typical phenological differences of the crops present. In the near future, issues regarding the classification of crops with small field sizes and heterogeneous cover (i.e., fallow and grasslands) need to be tackled to make this application fully operational.
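The polarimetric features named in the abstract (Pauli decomposition, Cloude–Pottier entropy/alpha, and the HH-VV co-pol coherence and phase difference) are derived from the quad-pol scattering matrix. The following is a minimal sketch of that derivation, assuming calibrated complex channels are already available as NumPy arrays and that reciprocity (S_hv = S_vh) holds; the window size, helper names, and the use of NumPy/SciPy are illustrative assumptions, not the study's actual processing chain. The pixel-level features would then be averaged per field object for the object-based classification.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def boxcar(x, size=7):
    """Spatially average (multilook) a real or complex image with a boxcar window."""
    if np.iscomplexobj(x):
        return uniform_filter(x.real, size) + 1j * uniform_filter(x.imag, size)
    return uniform_filter(x, size)


def coherency_matrix(k, size=7):
    """Build the 3x3 coherency matrix T = <k k^H> with element-wise spatial averaging."""
    T = np.einsum("...i,...j->...ij", k, np.conj(k))
    out = np.empty_like(T)
    for i in range(3):
        for j in range(3):
            out[..., i, j] = boxcar(T[..., i, j], size)
    return out


def polarimetric_features(s_hh, s_hv, s_vv, size=7):
    """Per-pixel polarimetric features from calibrated quad-pol channels."""
    feats = {}

    # Pauli decomposition intensities (odd-bounce, even-bounce, volume proxies).
    feats["pauli_odd"] = boxcar(np.abs(s_hh + s_vv) ** 2 / 2.0, size)
    feats["pauli_even"] = boxcar(np.abs(s_hh - s_vv) ** 2 / 2.0, size)
    feats["pauli_vol"] = boxcar(2.0 * np.abs(s_hv) ** 2, size)

    # HH-VV co-pol coherence magnitude and phase difference.
    cross = boxcar(s_hh * np.conj(s_vv), size)
    p_hh = boxcar(np.abs(s_hh) ** 2, size)
    p_vv = boxcar(np.abs(s_vv) ** 2, size)
    feats["copol_coherence"] = np.abs(cross) / np.sqrt(p_hh * p_vv + 1e-12)
    feats["copol_phase_diff"] = np.angle(cross)

    # Cloude-Pottier entropy and mean alpha angle from the coherency matrix.
    k = np.stack([s_hh + s_vv, s_hh - s_vv, 2.0 * s_hv], axis=-1) / np.sqrt(2.0)
    eigval, eigvec = np.linalg.eigh(coherency_matrix(k, size))
    eigval = np.clip(eigval, 1e-12, None)
    p = eigval / eigval.sum(axis=-1, keepdims=True)
    feats["entropy"] = -(p * np.log(p) / np.log(3.0)).sum(axis=-1)
    alpha_i = np.arccos(np.clip(np.abs(eigvec[..., 0, :]), 0.0, 1.0))
    feats["mean_alpha_deg"] = np.degrees((p * alpha_i).sum(axis=-1))
    return feats
```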
Highlights
Crop classification is one of the major agricultural applications of remote sensing
Classification results obtained with dual-pol (VV-VH) data as input were compared to those using quad-pol data in different polarization bases, and to configurations where several polarimetric features (Pauli and Cloude–Pottier decomposition features and co-pol coherence and phase difference) were added (a classification sketch follows these highlights)
The results of this study demonstrate that C-band Synthetic Aperture Radar (SAR) data can be effectively used for crop classification
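The comparison described in the second highlight amounts to training the same classifier on different feature sets and scoring it with overall accuracy and Kappa, the two metrics reported in the abstract. Below is a minimal sketch of that object-based Random Forests step, assuming per-field feature means have already been extracted into a table; the file name, column names, random train/test split, and hyperparameters are illustrative assumptions, not the study's actual setup.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

# One row per field object: e.g. VV and VH backscatter (dB) for the three
# acquisition dates (plus any polarimetric features), and a crop-type label.
fields = pd.read_csv("field_features.csv")  # hypothetical file name
feature_cols = [c for c in fields.columns if c != "crop"]

X_train, X_test, y_train, y_test = train_test_split(
    fields[feature_cols], fields["crop"],
    test_size=0.3, stratify=fields["crop"], random_state=0,
)

# Object-based supervised classification with Random Forests.
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
pred = rf.predict(X_test)

# The study reports overall accuracy and Kappa; compute both here.
print("Overall accuracy:", accuracy_score(y_test, pred))
print("Kappa:", cohen_kappa_score(y_test, pred))
```

Re-running this step with different feature subsets (dual-pol only, quad-pol in each basis, quad-pol plus decomposition and coherence features) reproduces the kind of comparison the study describes.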
Summary
Crop classification is one of the major agricultural applications of remote sensing. Knowing the crop present on each agricultural field is very valuable information at a range of scales. At the local and regional scales this information is a basic requirement to forecast yields and manage crop production [1], and to design agricultural policies and manage subsidies. Typical approaches based on multispectral imagery rely on the spectral signature of crops [4]. This can be of limited use because several crops may have very similar spectral signatures, since these are mostly governed by leaf pigments and the cellular structure of the mesophyll. Moreover, persistent cloud cover imposes serious limits on the viability of approaches based on optical remote sensing in some regions of the world [5].