Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, and out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters are retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, as well as statistical analyses of the received signal, which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the kurtosis and the Shapiro–Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the statistics literature. The objective of this work is to study the performance of a number of normality tests found in the statistics literature when applied to the detection of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented, with the aim of determining an omnibus test that can address the blind spots of currently used methods.
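
As a rough illustration of the two statistics mentioned above (a hypothetical sketch, not the paper's implementation), the following Python snippet applies the kurtosis and the Shapiro–Wilk test to a simulated Gaussian noise record with and without an added sinusoidal (continuous-wave) interferer; the sample size, noise level, and RFI amplitude are arbitrary assumptions chosen for illustration only.

```python
import numpy as np
from scipy.stats import kurtosis, shapiro

rng = np.random.default_rng(0)
n = 2048                      # samples per integration window (assumed value)

# RFI-free case: thermal noise modeled as a zero-mean Gaussian process
clean = rng.normal(0.0, 1.0, n)

# RFI-contaminated case: same noise plus a weak continuous-wave interferer
t = np.arange(n)
rfi = clean + 0.5 * np.sin(2 * np.pi * 0.1 * t)

for label, x in (("clean", clean), ("with RFI", rfi)):
    k = kurtosis(x, fisher=False)   # Gaussian data -> kurtosis close to 3
    w, p = shapiro(x)               # Shapiro-Wilk: small p-value -> reject normality
    print(f"{label:9s}  kurtosis={k:.2f}  Shapiro-Wilk p={p:.3g}")
```

In this sketch the clean record yields a kurtosis near 3 and a large Shapiro–Wilk p-value, while the contaminated record departs from both, which is the behavior the RFI-detection statistics exploit.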