Abstract
Monitoring and mapping vegetation dynamics using remote sensing data are essential for our understanding of land surface processes. Most current satellite-based methods process vegetation index time series from a sequence of images to retrieve key points that correspond to vegetation phenophases. Because deep learning approaches have proven powerful in processing individual images, we tested the applicability of a convolutional neural network (CNN) for mapping vegetation growth days (VGD) and the start of growing season (SOS) from individual Landsat images at fine spatial resolution. To provide references for both model training and testing, we applied the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM) to fuse image pairs from the Landsat 8 Operational Land Imager (OLI) and the Moderate Resolution Imaging Spectroradiometer (MODIS) for four Landsat tiles in China. We then applied a first-derivative method to retrieve VGD from the fused satellite data at fine spatial resolution. The CNN model was trained using each fused image as the input and the derived VGD as the target. The trained model was then used to map VGD from individual Landsat images. The results matched the reference maps well, as indicated by the evaluation metrics. For VGD, the method achieved a coefficient of determination of 0.85 and a root mean squared error of 8.17 days; for SOS, it achieved a coefficient of determination of 0.75 and a root mean squared error of 4.09 days. Compared with existing methods, which require time series of satellite data spanning the entire growth cycle to retrieve phenological metrics, this study provides an alternative way to map VGD as well as SOS from a single Landsat image. Our study highlights the power of deep learning models in extracting phenological features from individual remote sensing images, and the method could be used to map VGD and SOS in near real time.
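The abstract describes a two-step pipeline: derive reference VGD with a first-derivative rule on fused vegetation index time series, then train a CNN to regress VGD per pixel from a single image. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' code; the first-derivative rule, band count, network depth, and thresholds are assumptions made for the example.

```python
# Hypothetical sketch: first-derivative VGD targets + a small fully
# convolutional regressor. Not the authors' implementation.
import numpy as np
import torch
import torch.nn as nn

def vgd_from_ndvi_series(ndvi, doy):
    """Assumed first-derivative rule: SOS = day of steepest NDVI increase;
    VGD at the last scene = days elapsed since SOS (clipped at 0).
    ndvi: (T, H, W) fused NDVI stack; doy: (T,) day of year of each scene."""
    d_ndvi = np.diff(ndvi, axis=0)             # first derivative along time
    sos_idx = np.argmax(d_ndvi, axis=0)        # time step of fastest green-up
    sos_doy = doy[1:][sos_idx]                 # (H, W) start of season
    vgd = np.clip(doy[-1] - sos_doy, 0, None)  # growth days at the last scene
    return sos_doy, vgd

class VGDNet(nn.Module):
    """Minimal fully convolutional network: one VGD value per pixel."""
    def __init__(self, in_bands=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_bands, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 1, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(1)          # (N, H, W) predicted VGD

# Toy training loop on random tensors standing in for fused image patches
# (inputs) and first-derivative VGD maps (targets).
model, loss_fn = VGDNet(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(8, 6, 64, 64)                   # 8 patches, 6 bands, 64x64 px
y = torch.rand(8, 64, 64) * 120                # reference VGD in days
for _ in range(5):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```

In this reading, the network learns to infer "how far along the growing season" a pixel is from the spectral signature of a single date, which is what allows VGD and SOS to be mapped without a full-season time series at prediction time.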