Abstract

Accurate spatiotemporal information about crop progress during the growing season is critical for crop yield estimation. Crop progress monitoring at field scale requires remote sensing data with high resolution in both time and space, a requirement that no single sensor can currently satisfy. Data fusion approaches have therefore been developed to blend remote sensing imagery from the Landsat and MODIS instruments. The Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) is one of the most popular spatial and temporal data fusion algorithms, has been applied in many applications, and its fusion accuracy has been evaluated at many sites. Previous studies found that the accuracy of data fusion results depends on the image pair used. In this study, several Landsat-8 reflectance images (path 28/row 31) from 2015 were selected as pair images to evaluate data fusion accuracy. Results were assessed against observed Landsat images that had not been used as pair images because of partial cloud cover or image gaps. Several statistical metrics, including the average absolute difference, root mean square error, correlation coefficient, and spectral angle mapper, were calculated to assess the data fusion results. The initial results show that the predictive ability of image pairs differs by acquisition date: as expected, pairs acquired closer to the prediction date yield better accuracy. Interestingly, the two crop types (corn and soybeans) show different data fusion accuracies even when the same image pair is used. This study suggests that data fusion results could be further improved if an appropriate image pair is selected. Accurate, dense time-series data at Landsat resolution will enhance our ability to monitor crop condition and estimate crop yield at field scale.
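The abstract names the assessment metrics (average absolute difference, RMSE, correlation coefficient, spectral angle mapper) but not how they are computed. The sketch below shows one way such a per-band and per-pixel comparison between STARFM-predicted and observed Landsat reflectance could be implemented with NumPy; the function names, array shapes, and synthetic values are illustrative assumptions, not code or data from the study.

```python
import numpy as np


def fusion_accuracy_metrics(predicted, observed):
    """Per-band comparison of predicted vs. observed surface reflectance.

    Inputs are arrays of reflectance for a single band, already masked to
    valid (cloud-free) pixels. Returns the average absolute difference (AAD),
    root mean square error (RMSE), and Pearson correlation coefficient (r).
    """
    p = np.asarray(predicted, dtype=float).ravel()
    o = np.asarray(observed, dtype=float).ravel()

    aad = np.mean(np.abs(p - o))           # average absolute difference
    rmse = np.sqrt(np.mean((p - o) ** 2))  # root mean square error
    r = np.corrcoef(p, o)[0, 1]            # correlation coefficient
    return aad, rmse, r


def spectral_angle(predicted_bands, observed_bands):
    """Per-pixel spectral angle mapper (SAM), in radians.

    Inputs have shape (n_bands, n_pixels); the angle is measured between the
    predicted and observed spectral vectors at each pixel.
    """
    p = np.asarray(predicted_bands, dtype=float)
    o = np.asarray(observed_bands, dtype=float)
    dot = np.sum(p * o, axis=0)
    norms = np.linalg.norm(p, axis=0) * np.linalg.norm(o, axis=0)
    cos = np.clip(dot / norms, -1.0, 1.0)
    return np.arccos(cos)


if __name__ == "__main__":
    # Synthetic reflectance values as placeholders, not study data.
    rng = np.random.default_rng(0)
    observed = rng.uniform(0.02, 0.45, size=(6, 1000))   # 6 bands, 1000 pixels
    predicted = observed + rng.normal(0.0, 0.01, size=observed.shape)

    aad, rmse, r = fusion_accuracy_metrics(predicted[0], observed[0])
    sam_mean = np.degrees(spectral_angle(predicted, observed)).mean()
    print(f"AAD={aad:.4f}  RMSE={rmse:.4f}  r={r:.3f}  mean SAM={sam_mean:.2f} deg")
```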

