Abstract

Cross-calibration between sensors is necessary to bring measurements to a common radiometric scale; it allows more complete monitoring of land surface processes and enhances data continuity and harmonization. However, differences in the Relative Spectral Response (RSR) of sensors introduce uncertainties into the process [1]. Compensating for these differences is therefore of great importance and can be achieved with a spectral band adjustment factor (SBAF), which establishes a relationship between two spectrally adjusted bands. Nonetheless, this relationship has been shown to depend strongly on the surface type [2] and therefore needs to be corrected. In this work, we compute the SBAF between the historical Landsat sensors and Sentinel-2 by using the RSRs of different passive optical sensors in the Green, Red, and NIR bands, together with surface reflectance spectral libraries (ASTER, AVIRIS, IGCP) covering a wide variety of classes. We produce a quadratic fit of the SBAF versus the surface's NDVI, $(\rho_{nir}-\rho_{red})/(\rho_{nir}+\rho_{red})$, and propose an exponential correction equation dependent on the NDVI value for both bands. A comparison between Landsat 8 and Sentinel-2 images using the HLS product shows that this method improves the red band and NDVI accuracy by 46.4% and 63.9%, respectively, when the difference between the RSRs is significant, but is inaccurate for the green band, where the atmospheric correction is likely to introduce errors of the same order.
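The core computation can be illustrated with a minimal sketch: the SBAF for a band is the ratio of the two sensors' band-averaged reflectances, obtained by convolving each surface reflectance spectrum with the corresponding RSR, and is then regressed quadratically against NDVI. The Gaussian RSRs and sigmoid "red-edge" spectra below are purely illustrative stand-ins for the real Landsat/Sentinel-2 filter curves and spectral-library entries, and the fixed NIR reflectance is an assumption for the sake of the example.

```python
import numpy as np

def band_reflectance(spectrum, rsr):
    """RSR-weighted band-averaged reflectance on a uniform wavelength grid."""
    return np.sum(spectrum * rsr) / np.sum(rsr)

# Hypothetical Gaussian red-band RSRs for two sensors
# (centers and widths are illustrative, not the real filter curves).
wl = np.linspace(550.0, 750.0, 401)                  # wavelength grid, nm
rsr_a = np.exp(-0.5 * ((wl - 655.0) / 15.0) ** 2)    # "sensor A" red band
rsr_b = np.exp(-0.5 * ((wl - 665.0) / 12.0) ** 2)    # "sensor B" red band

rng = np.random.default_rng(0)
ndvi_vals, sbaf_vals = [], []
for _ in range(200):
    # Toy vegetation-like spectrum: low red reflectance rising steeply
    # toward the NIR (a "red edge") with a random edge position and slope.
    edge = rng.uniform(680.0, 720.0)
    slope = rng.uniform(0.02, 0.08)
    spectrum = 0.05 + 0.45 / (1.0 + np.exp(-slope * (wl - edge)))

    rho_a = band_reflectance(spectrum, rsr_a)
    rho_b = band_reflectance(spectrum, rsr_b)
    rho_nir = 0.5                                    # assumed NIR reflectance
    ndvi_vals.append((rho_nir - rho_a) / (rho_nir + rho_a))
    sbaf_vals.append(rho_a / rho_b)                  # SBAF: ratio of band values

# Quadratic fit of SBAF against NDVI
coeffs = np.polyfit(ndvi_vals, sbaf_vals, 2)
print("SBAF(NDVI) ~ {:.4f}*NDVI^2 + {:.4f}*NDVI + {:.4f}".format(*coeffs))
```

The real analysis replaces the synthetic spectra with library spectra (ASTER, AVIRIS, IGCP) and the Gaussian filters with published RSRs, but the SBAF-vs-NDVI regression step is structurally the same.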
