Abstract
The SMOS (Soil Moisture and Ocean Salinity) mission was selected in May 1999 by the European Space Agency to provide global and frequent soil moisture and sea surface salinity maps. SMOS's single payload is MIRAS (Microwave Imaging Radiometer by Aperture Synthesis), an L-band 2D aperture synthesis interferometric radiometer with multi-angular observation capabilities and dual-polarization and full-polarimetric modes. The impact of thermal drifts on the SMOS rms radiometric accuracy over a complete orbit is evaluated by means of the SMOS End-to-end Performance Simulator (SEPS) [1,2,3] for three cases of interest: the so-called cold, hot, and LAP (Low Available Power) cases, which correspond to the extreme thermal conditions in a year and to the case in which there is not enough power to maintain thermal control. Errors originating from the receivers' frequency response mismatch and its thermal drift are included, while antenna voltage pattern errors are assumed to be identical across receivers and not temperature-dependent. Current error correction and image reconstruction algorithms are applied to obtain the synthetic brightness temperatures. To avoid scene-dependent effects on the rms radiometric accuracy drift, the accuracy is computed using a constant 150 K brightness temperature in the directions occupied by the Earth, and external sources such as the Sun, the Moon, and the sky (cosmic and galactic radiation) are switched off in SEPS. The inter-calibration period is then estimated from the maximum error drift that can be accepted in each case.
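The two figures of merit described above can be sketched in a few lines: the rms radiometric accuracy as the rms deviation of a reconstructed brightness temperature map from the flat 150 K reference scene, and the inter-calibration period as the time at which the accumulated error drift first exceeds the accepted maximum. The function names, the linear-drift toy series, and the threshold value are illustrative assumptions, not SEPS internals.

```python
import numpy as np

# Constant reference scene used to avoid scene-dependent effects (from the abstract).
T_REF = 150.0  # K

def rms_radiometric_accuracy(tb_reconstructed):
    """rms deviation (K) of a reconstructed TB map from the 150 K reference scene."""
    return float(np.sqrt(np.mean((tb_reconstructed - T_REF) ** 2)))

def intercalibration_period(times, rms_series, max_drift):
    """First time at which the rms error drift (relative to the start of the
    orbit) exceeds the accepted maximum; None if it never does."""
    drift = rms_series - rms_series[0]
    exceeded = np.nonzero(drift > max_drift)[0]
    return times[exceeded[0]] if exceeded.size else None

# Toy usage: an assumed linear thermal drift of the rms error along the orbit.
times = np.linspace(0.0, 100.0, 11)   # minutes along the orbit (illustrative)
rms = 0.5 + 0.01 * times              # K, assumed linear drift (illustrative)
print(intercalibration_period(times, rms, max_drift=0.5))  # prints 60.0
```

Under these toy numbers, the drift first exceeds the 0.5 K allowance 60 minutes into the orbit, so re-calibration would be required at least that often.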