Abstract

Radiative transfer model (RTM)-simulated microwave (MW) brightness temperatures (Tbs) are commonly used to monitor and evaluate antenna temperature (Ta) data observed by space-based MW sensors. Although these simulated Tbs are paramount to this data integrity maintenance activity, their uncertainties have not been quantified. This study develops and implements a method to estimate these simulated Tb uncertainties based on a statistical comparison of two Community RTM (CRTM)-simulated operational MW sounder Tb data sets, separately generated using European Centre for Medium-Range Weather Forecasts (ECMWF) Atmospheric Model High Resolution (HRES) and Global Navigation Satellite System (GNSS) Radio Occultation (RO) sounding inputs. The study shows that the smallest single-sensor CRTM-simulated Tb uncertainties, computed from differences between the ECMWF HRES- and GNSS RO-based simulated Tb data sets, are on the order of 10⁻⁴ relative to a 300 K Tb for the two NOAA operational MW sounder channels with low- to mid-tropospheric peak weighting function sensitivity. Meanwhile, inter-sensor simulated Tb differences, computed from the double difference of single-sensor simulated Tb differences, lead to CRTM-simulated Tb uncertainties on the order of 10⁻⁴ for at least nine MW sounder channels with peak sensitivity ranging from the lower troposphere to the lower stratosphere. These findings provide the basis for future work to assess the ability to identify and quantify suspected on-orbit MW sounder calibration anomalies using RTM-driven, on-orbit MW instrument monitoring techniques.
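The core statistics described above are the per-sensor difference between HRES-based and GNSS RO-based simulated Tbs and the inter-sensor double difference of those quantities. The minimal Python sketch below illustrates how such relative uncertainties (quoted against a 300 K reference Tb) could be computed; it is not the authors' implementation, and all variable names and the synthetic stand-in arrays are hypothetical.

```python
import numpy as np

# Illustrative sketch only: single-sensor and inter-sensor double-difference
# statistics for CRTM-simulated Tbs. tb_hres_* and tb_ro_* stand in for
# collocated simulated brightness temperatures (one channel) generated with
# ECMWF HRES and GNSS RO inputs, respectively, for sensors A and B.

def single_sensor_diff(tb_hres, tb_ro):
    """Per-collocation difference between HRES- and RO-based simulations."""
    return tb_hres - tb_ro

def relative_uncertainty(diff, reference_tb=300.0):
    """Spread of simulated-Tb differences expressed relative to 300 K."""
    return np.std(diff) / reference_tb

rng = np.random.default_rng(0)
n = 5000
# Synthetic data: a ~0.03 K spread yields a relative value near 1e-4.
tb_hres_a = 250.0 + rng.normal(0.0, 0.5, n)
tb_ro_a   = tb_hres_a + rng.normal(0.0, 0.03, n)
tb_hres_b = 250.0 + rng.normal(0.0, 0.5, n)
tb_ro_b   = tb_hres_b + rng.normal(0.0, 0.03, n)

diff_a = single_sensor_diff(tb_hres_a, tb_ro_a)   # sensor A: HRES - RO
diff_b = single_sensor_diff(tb_hres_b, tb_ro_b)   # sensor B: HRES - RO

# Inter-sensor "double difference": differencing the two single-sensor
# differences cancels errors common to both simulation pathways.
double_diff = np.mean(diff_a) - np.mean(diff_b)

print(f"sensor A relative uncertainty:  {relative_uncertainty(diff_a):.1e}")
print(f"inter-sensor double difference: {double_diff:+.3f} K")
```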
