Abstract

The quality control of meteorological data has always been an important, if not always fully appreciated, step in the use of the data for analysis and forecasting. In most quality-control approaches, erroneous data are treated as nonrandom "outliers" to the data distribution that must be eliminated. The elimination of such data traditionally proceeds from coarse to finer filters. More recent methods use the fit (or lack of fit) of such data to an analysis that excludes them to determine whether the data are acceptable. The complex quality-control (CQC) approach, on the other hand, recognizes that most rough errors are caused by human error and can likely be corrected. In the CQC approach, several independent checks are made that provide numerical measures of any error magnitude. It is only after all check magnitudes, called residuals, are calculated that data quality is determined and errors are corrected when possible. The data-quality assessment and correction are made by the sophisticated logic of the decision-making algorithm (DMA). The principles and development of the method of CQC for radiosonde data were given by Gandin. The development of CQC at the National Centers for Environmental Prediction (NCEP) for the detection and correction of errors in radiosonde heights and temperatures, called the complex quality control for heights and temperatures (CQCHT), has progressed from the use of a complex of hydrostatic checks only to the use of statistical and other checks as well, thereby becoming progressively more sophisticated. This paper describes a major restructuring of the use of the radiosonde data and of the logical basis of the DMA in the operational CQCHT algorithm at NCEP so that, unlike previous implementations, all data levels are treated together, thus potentially allowing a correction at any level to influence subsequent corrections at adjacent levels, whether they are mandatory or significant. At each level, treated one by one from the surface upward, all available checks are used to make the appropriate decisions. Several vertical passes may be made through the data until no more corrections are possible. Final passes look for "observation" errors. The methods of error determination are outlined, and the effect of errors on the residuals is illustrated. The calculation of residuals is described, their availability for each type of data surface (e.g., earth's surface, mandatory level, significant level) is given, and their use by the DMA is presented. The limitations of the use of various checks are discussed.
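To make the structure described above concrete, the sketch below shows, in Python, one way a residual-based check and a multi-pass decision loop of this general kind could be organized. It is a minimal illustration under stated assumptions, not the operational CQCHT code: the hypsometric residual uses a crude layer-mean temperature, and the names (`hydrostatic_residual`, `quality_control`, `layer_below_check`), the 50 m tolerance, and the simple agreement rule that stands in for the DMA are all hypothetical simplifications.

```python
import math

# Standard constants for the hypsometric relation.
R_D = 287.05    # gas constant for dry air, J kg^-1 K^-1
G0 = 9.80665    # standard gravity, m s^-2


def hydrostatic_residual(p_bot, z_bot, t_bot, p_top, z_top, t_top):
    """Residual of a simple hydrostatic (hypsometric) check for one layer.

    Compares the reported thickness (z_top - z_bot) with the thickness
    implied by a crude layer-mean temperature; a large residual hints at
    a rough error in a height or temperature bounding the layer.
    """
    t_mean = 0.5 * (t_bot + t_top)                            # K
    implied = (R_D / G0) * t_mean * math.log(p_bot / p_top)   # m
    return (z_top - z_bot) - implied


def quality_control(levels, checks, tolerance=50.0, max_passes=4):
    """Schematic multi-pass decision loop over one sounding.

    `levels` is ordered from the surface upward; each element of `checks`
    returns a residual (m) for level k. Because a correction applied on
    one pass changes the residuals seen on the next, passes repeat until
    no further corrections are made or `max_passes` is reached.
    """
    for _ in range(max_passes):
        corrected_any = False
        for k, level in enumerate(levels):
            residuals = [check(levels, k) for check in checks]
            large = [r for r in residuals if abs(r) > tolerance]
            same_sign = len({math.copysign(1.0, r) for r in residuals}) == 1
            if large and len(large) == len(residuals) and same_sign:
                # Toy rule: all checks agree on a large residual of the
                # same sign, so adjust the height by the mean residual.
                level["z"] -= sum(residuals) / len(residuals)
                level["qc"] = "corrected"
                corrected_any = True
            elif large:
                level["qc"] = "suspect"   # checks disagree: flag only
        if not corrected_any:
            break                         # no more corrections possible
    return levels


# Hypothetical usage with a single hydrostatic check on the layer below.
def layer_below_check(levels, k):
    if k == 0:
        return 0.0
    lo, hi = levels[k - 1], levels[k]
    return hydrostatic_residual(lo["p"], lo["z"], lo["t"],
                                hi["p"], hi["z"], hi["t"])


sounding = [
    {"p": 1000.0, "z": 110.0,  "t": 288.0},
    {"p": 850.0,  "z": 1460.0, "t": 280.0},
    {"p": 700.0,  "z": 3010.0, "t": 271.0},
]
quality_control(sounding, [layer_below_check])
```

In the operational algorithm, many more checks contribute residuals and the DMA's decision logic is far more elaborate, but the repeat-until-no-corrections structure mirrors the vertical passes described in the abstract.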
