Abstract
Identifying rice cultivation areas in a timely and accurate manner is of great significance for understanding the overall distribution pattern of rice and for formulating agricultural policies. Remote sensing observation provides a convenient means of monitoring the distribution of rice cultivation areas on a large scale. Many studies rely on single-source or single-temporal remote sensing images, which makes it difficult to exploit the information that rice exhibits in different image types and at different growth stages, leading to unsatisfactory identification results. This paper presents a rice cultivation area identification method based on a deep learning model that uses multi-source and multi-temporal remote sensing images. Specifically, a U-Net based model is employed to identify rice planting areas from both a Landsat-8 optical dataset and a Sentinel-1 Polarimetric Synthetic Aperture Radar (PolSAR) dataset; to fully account for the spectral reflectance and polarimetric scattering characteristics of rice at different periods, multiple image features from multi-temporal Landsat-8 and Sentinel-1 images are fed into the network to train the model. Experimental results on China's Sanjiang Plain demonstrate that the proposed Multi-Source and Multi-Temporal Rice Identification U-Net (MSMTRIU-NET) achieves high classification accuracy, and that feeding the network more information from multi-source and multi-temporal images indeed improves classification performance; furthermore, the resulting classification map exhibits greater continuity, and the boundaries between rice cultivation regions and the surrounding environment reflect reality more accurately.
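To make the input scheme concrete, the sketch below shows one common way such multi-source, multi-temporal fusion is realized: optical and SAR features from several acquisition dates are stacked along the channel axis and passed to a U-Net style encoder-decoder. This is a minimal illustration, not the paper's implementation; the channel counts, the `TinyUNet` class, and the two-level depth are all illustrative assumptions, and the abstract does not specify these details.

```python
import torch
import torch.nn as nn

# Illustrative channel counts (NOT from the paper): e.g. 6 Landsat-8
# surface-reflectance bands over 3 dates, plus 2 Sentinel-1
# polarizations (VV, VH) over 4 dates.
N_OPTICAL = 6 * 3
N_SAR = 2 * 4
IN_CHANNELS = N_OPTICAL + N_SAR

def conv_block(c_in, c_out):
    """Two 3x3 conv + ReLU layers, the standard U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """A two-level U-Net accepting the stacked multi-source input."""
    def __init__(self, in_ch, n_classes=2):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)          # 64 in: upsampled + skip
        self.head = nn.Conv2d(32, n_classes, 1)  # rice / non-rice logits

    def forward(self, x):
        e1 = self.enc1(x)              # full-resolution features
        e2 = self.enc2(self.pool(e1))  # half-resolution features
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
        return self.head(d1)           # per-pixel class logits

# Multi-temporal optical and SAR patches are concatenated along the
# channel axis before entering the network.
optical = torch.randn(1, N_OPTICAL, 128, 128)  # Landsat-8 time stack
sar = torch.randn(1, N_SAR, 128, 128)          # Sentinel-1 time stack
x = torch.cat([optical, sar], dim=1)
logits = TinyUNet(IN_CHANNELS)(x)              # shape: (1, 2, 128, 128)
```

Channel stacking of this kind lets a single encoder learn joint spectral and scattering features across dates, which is one plausible reading of how the abstract's "multiple image features ... are fed into the network".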