Abstract

The Local Oscillators (LOs) of the Microwave Imaging Radiometer using Aperture Synthesis (MIRAS) onboard the Soil Moisture and Ocean Salinity (SMOS) satellite maintain the operating frequency of its 69 receivers. The phase of each LO drifts over time, which in turn blurs the MIRAS brightness temperature (TB) measurements. After a pre-launch assessment, it was decided to calibrate the LOs every 10 minutes to reduce the phase drifts. During short periods of the first 2.5 years of the SMOS mission, the LO calibration was instead performed every 2 minutes to assess the impact of a higher calibration frequency on data quality. In this study, relative differences (10-min TBs versus 2-min TBs) of about 0.3 K are found, which lead to non-negligible relative differences of about 0.2–0.3 practical salinity units (psu) in the retrieved sea surface salinity (SSS). However, independent validation against Argo float SSS at Level 3 (spatio-temporally averaged SSS products) shows no significant differences between the 10-min and 2-min data. This is because the current SMOS SSS accuracy relative to Argo is about 0.6–0.8 psu, which masks the comparatively smaller effect of the LO calibration frequency.
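The masking argument follows from how independent errors combine. The following minimal Python sketch is illustrative only: it treats Argo SSS as ground truth, assumes independent Gaussian retrieval noise, and plugs in rough magnitudes quoted in the abstract (a ~0.25 psu calibration-related offset and ~0.7 psu per-cell SMOS-minus-Argo scatter); all variable names and values are hypothetical, not taken from the SMOS processing chain.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative magnitudes from the abstract (not actual SMOS products):
N = 5000                 # hypothetical Level-3 grid cells matched to Argo floats
retrieval_noise = 0.7    # per-cell SMOS-minus-Argo scatter, ~0.6-0.8 psu
lo_effect = 0.25         # 10-min vs 2-min relative difference, ~0.2-0.3 psu

# "True" SSS field, with Argo taken as its error-free proxy.
sss_true = 35.0 + rng.normal(0.0, 1.0, N)

# Simulated Level-3 SSS for the two LO calibration cadences: the 2-min
# product is the reference, and the 10-min product carries the extra
# LO-related offset on top of the same retrieval noise level.
sss_2min = sss_true + rng.normal(0.0, retrieval_noise, N)
sss_10min = sss_true + lo_effect + rng.normal(0.0, retrieval_noise, N)

for label, sss in (("2-min", sss_2min), ("10-min", sss_10min)):
    diff = sss - sss_true
    print(f"{label:>7}: bias = {diff.mean():+.3f} psu, "
          f"std = {diff.std():.3f} psu, "
          f"rmsd = {np.sqrt((diff ** 2).mean()):.3f} psu")
```

Because the two error sources add in quadrature, the LO effect raises the validation RMSD only from about 0.70 psu to sqrt(0.70^2 + 0.25^2) ≈ 0.74 psu, a change small enough to be indistinguishable in practice, consistent with the study finding no significant Level-3 differences against Argo.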
