Abstract
Farmland parcel-based crop classification using satellite data plays an important role in precision agriculture. In this study, a deep-learning-based time-series analysis method employing optical images and synthetic aperture radar (SAR) data is presented for crop classification in cloudy and rainy regions. Central to this method are the spatio-temporal integration of high-resolution optical images with multi-temporal SAR data and deep-learning-based time-series analysis. First, a precise farmland parcel map is delineated from high-resolution optical images. Second, pre-processed SAR intensity images are overlaid onto the parcel map to construct time series of crop growth for each parcel. Third, a deep-learning classifier based on the long short-term memory (LSTM) network is employed to learn time-series features of crops and to classify parcels, producing a final classification map. The method was applied to two datasets of high-resolution ZY-3 images and multi-temporal Sentinel-1A SAR data to classify crop types in Hunan and Guizhou, China. The classification results, with a 5.0% improvement in overall accuracy over those of traditional methods, illustrate the effectiveness of the proposed framework for parcel-based crop classification in southern China. A further analysis of the relationship between crop calendars and change patterns of time-series intensity indicates that the LSTM model could learn and extract useful features for time-series crop classification.
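To make the third step concrete, the sketch below illustrates what an LSTM-based parcel classifier of this kind might look like. It is a minimal, hypothetical example, not the authors' implementation: the feature layout (VV/VH backscatter per acquisition), layer sizes, and class count are assumptions for illustration only.

```python
# Minimal sketch of an LSTM-based parcel time-series classifier (PyTorch),
# assuming each parcel is represented by a fixed-length sequence of
# Sentinel-1A intensity features (e.g. VV/VH backscatter per acquisition).
# All layer sizes and the number of crop classes are illustrative.
import torch
import torch.nn as nn

class ParcelLSTMClassifier(nn.Module):
    def __init__(self, n_features=2, hidden_size=64, n_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, n_features) -- one sequence per parcel
        _, (h_n, _) = self.lstm(x)      # h_n: (1, batch, hidden_size)
        return self.fc(h_n[-1])         # crop-class logits per parcel

# Example: 8 parcels, 20 SAR acquisitions, VV+VH intensity features
model = ParcelLSTMClassifier()
logits = model(torch.randn(8, 20, 2))
predicted_crop = logits.argmax(dim=1)   # predicted crop type per parcel
```

Using the final hidden state as the sequence summary is one common design choice; attention-based pooling over time steps is another, but the source does not specify which was used beyond the LSTM itself.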