Abstract

Monitoring terrestrial photosynthetic capacity is vital for understanding ecological processes and modelling the responses of vegetated ecosystems to diverse environmental changes. Among the multiple instruments foreseen to collect data over global terrestrial landscapes in the near future, the "FLuorescence EXplorer" (FLEX) mission of the European Space Agency (ESA) is planned for launch by 2024. FLEX will be dedicated to vegetation fluorescence measurements and will fly in tandem with the operational Sentinel-3 (S3) mission. Thanks to the emergence of cloud-computing platforms such as Google Earth Engine (GEE) and the ability of machine learning (ML) methods to solve prediction problems efficiently, a paradigm shift away from traditional image analysis towards cloud-based processing can be observed. Therefore, we present a workflow to automate the spatiotemporal mapping of essential vegetation traits from S3 imagery in GEE, including leaf chlorophyll content (LCC), leaf area index (LAI), fraction of absorbed photosynthetically active radiation (FAPAR), and fractional vegetation cover (FVC). The retrieval strategy involved Gaussian process regression (GPR) algorithms trained on top-of-atmosphere (TOA) radiance simulated by the coupled canopy radiative transfer model (RTM) Soil Canopy Observation, Photochemistry and Energy fluxes (SCOPE) and the atmospheric RTM Second Simulation of a Satellite Signal in the Solar Spectrum-vector (6SV). This approach combines the physical principles of RTMs with the computational efficiency of ML. The established S3 TOA-GPR 1.0 retrieval models were directly implemented in GEE to quantify the traits from TOA data as acquired by the S3 Ocean and Land Colour Instrument (OLCI) sensor. Theoretical validation provided good to high accuracy, with normalized root mean square error (NRMSE) ranging from 5% (FAPAR) to 19% (LAI). Subsequently, a three-fold evaluation approach was pursued at diverse sites and land cover types: (1) temporal comparison against LAI and FAPAR products obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) for the time window 2016-2020, (2) spatial difference mapping with Copernicus Global Land Service (CGLS) estimates, and (3) direct validation using interpolated in-situ data from the VALERI campaigns. Validation against these three data sets achieved promising results. For the MODIS FAPAR product, selected sites demonstrated coherent seasonal patterns, with spatially-averaged mean differences of only 7%. Regarding the spatial mapping comparison, estimates provided by the S3 TOA-GPR 1.0 models showed the highest consistency with the FVC and FAPAR CGLS products, with absolute deviations below 0.3. Moreover, the direct validation of our S3 TOA-GPR 1.0 models against VALERI estimates indicated good retrieval performance for LAI, FAPAR and FVC. With these promising results, our proposed retrieval workflow opens the path towards the use and optimisation of continental-to-global monitoring of fundamental vegetation traits in GEE, accessible to the whole research community. Eventually, observations of these vegetation traits can be assimilated into terrestrial biosphere models for estimating global gross primary productivity and carbon fluxes. Consequently, once FLEX is launched, the presented S3 TOA-GPR 1.0 retrieval models are expected to contribute to process-based assimilation models aiming to quantify actual terrestrial photosynthetic activity from future S3-FLEX mission data.
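The minimal sketch below illustrates the general idea of such a hybrid retrieval workflow; it is not the S3 TOA-GPR 1.0 implementation described in the abstract. It assumes a hypothetical table of SCOPE+6SV-simulated OLCI radiances ("scope_6sv_simulations.csv", column names matching the OLCI band names), uses scikit-learn's GaussianProcessRegressor as a stand-in for the purpose-built models, and samples the public COPERNICUS/S3/OLCI collection via the Earth Engine Python API.

```python
# Illustrative sketch (not the authors' S3 TOA-GPR 1.0 code):
# (1) train a GPR model on RTM-simulated TOA radiance, and
# (2) apply it to Sentinel-3 OLCI TOA data sampled from GEE.
# Assumptions: a hypothetical CSV "scope_6sv_simulations.csv" holds
# SCOPE+6SV-simulated OLCI band radiances (columns "Oa01_radiance", ...,
# "Oa21_radiance") and the target trait (here LAI); per-band OLCI scale
# factors and cloud screening are omitted for brevity.

import numpy as np
import pandas as pd
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
import ee

# --- 1. Train a GPR retrieval model on simulated TOA radiance --------------
sim = pd.read_csv("scope_6sv_simulations.csv")            # hypothetical table
band_cols = [c for c in sim.columns if c.endswith("_radiance")]
X, y = sim[band_cols].to_numpy(), sim["LAI"].to_numpy()

kernel = 1.0 * RBF(length_scale=np.ones(X.shape[1])) + WhiteKernel()
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# --- 2. Sample S3 OLCI TOA radiance in GEE and predict the trait -----------
ee.Initialize()

olci = (ee.ImageCollection("COPERNICUS/S3/OLCI")          # S3 OLCI Level-1B TOA
        .filterDate("2019-06-01", "2019-07-01")
        .select(band_cols)
        .median())                                        # simple composite

point = ee.Geometry.Point(5.0, 52.0)                      # arbitrary location
feat = olci.sample(region=point, scale=300).first().getInfo()
x_obs = np.array([[feat["properties"][b] for b in band_cols]])

lai_mean, lai_std = gpr.predict(x_obs, return_std=True)   # mean + uncertainty
print(f"Predicted LAI: {lai_mean[0]:.2f} +/- {lai_std[0]:.2f}")
```

In the workflow described above, the trained GPR models are implemented directly in GEE so that trait maps are produced server-side; the client-side prediction shown here is simplified for illustration only.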
