Abstract
To eliminate radio-frequency interference (RFI) in a passive microwave radiometer, the threshold level is generally calculated from the mean value and standard deviation. However, a serious problem can arise: an error in the retrieved brightness temperature caused by a threshold level inflated by the presence of RFI. In this paper, we propose a method to detect and mitigate RFI contamination using a threshold level derived from statistical criteria based on a spectrogram technique. Mean and skewness spectrograms are created from a brightness temperature spectrogram by shifting a 2-D window, in order to identify the symmetric distribution that characterizes a natural thermal emission signal. From the bins of the mean spectrogram that remain after removing the RFI-flagged bins of the skewness spectrogram, for data captured at 0.1-s intervals, symmetric candidate distributions are constructed by mirroring the left side of the distribution while varying its reference position. For each candidate distribution, the kurtosis is computed repeatedly, and the retrieved brightness temperature is determined from the distribution whose kurtosis is closest to three. The performance is evaluated using experimental data: with a window of 100 × 100 time–frequency bins, the maximum error and root-mean-square error (RMSE) in the retrieved brightness temperature are shown to be less than approximately 3 K and 1.7 K, respectively, across the RFI levels and cases considered.
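The windowing step described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it slides a non-overlapping 2-D window over a brightness-temperature spectrogram, computes per-window mean and skewness, and flags asymmetric windows as RFI-contaminated. The window stride and the skewness threshold are assumptions for illustration only.

```python
import numpy as np

def windowed_stats(tb, win=100):
    """Compute mean and skewness spectrograms from a brightness-temperature
    spectrogram `tb` (time x frequency) using a non-overlapping 2-D window.
    Illustrative sketch; the paper's exact windowing parameters are assumed.
    """
    nt, nf = tb.shape[0] // win, tb.shape[1] // win
    mean_sg = np.empty((nt, nf))
    skew_sg = np.empty((nt, nf))
    for i in range(nt):
        for j in range(nf):
            block = tb[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            d = block - block.mean()
            m2 = np.mean(d**2)
            m3 = np.mean(d**3)
            mean_sg[i, j] = block.mean()
            skew_sg[i, j] = m3 / m2**1.5  # ~0 for a symmetric distribution
    return mean_sg, skew_sg

def flag_rfi_bins(skew_sg, thresh=0.5):
    # Natural thermal emission is near-Gaussian (skewness ~ 0); windows with
    # strongly asymmetric statistics are flagged as RFI-contaminated.
    # The threshold value is illustrative, not taken from the paper.
    return np.abs(skew_sg) > thresh
```

A clean window of Gaussian thermal noise yields skewness near zero and is not flagged, while a window containing sparse strong RFI spikes produces a large positive skewness and is flagged.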
Highlights
The presence of radio-frequency interference (RFI) is a significant issue in radiometric measurements, potentially caused by any type of electromagnetic emission, such as those from communication and navigation systems
The aim of this paper is to demonstrate RFI detection capability and to present a mitigation algorithm that minimizes potential errors originating from the sample intensity level, using a threshold level derived from statistical criteria based on a spectrogram technique
RFI signals with levels of approximately 50 K and 100 K are detected in the spectrograms in Figure 8; the relatively strong mean-spectrogram values become larger than those of the brightness temperature when the window size is increased
Summary
The presence of radio-frequency interference (RFI) is a significant issue in radiometric measurements, potentially caused by any type of electromagnetic emission, such as those from communication and navigation systems. Even though microwave radiometry for measuring sea salinity and soil moisture operates within the protected band of 1400 to 1427 MHz, RFI can still degrade radiometric measurements owing to leakage from adjacent bands [1,2]. The effect of RFI from adjacent land-based sources is more critical for ground-based microwave radiometers in outdoor remote sensing applications than for measurements over the open ocean from airborne and satellite platforms [3,4,5,6]. Ground-based microwave radiometers have been used for Internet of Things (IoT) applications [10], for which effective detection and mitigation of RFI contamination is required. A powerful means of RFI detection and mitigation is the use of the kurtosis statistic.
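The kurtosis-based principle mentioned above can be illustrated with a short sketch (not the paper's implementation): Gaussian thermal noise has a Pearson kurtosis of exactly 3, so a measured kurtosis that deviates from 3 indicates non-thermal (RFI) content. Pulsed RFI drives the kurtosis above 3, while continuous-wave (sinusoidal) RFI drives it below 3. The tolerance value used here is an assumption for illustration.

```python
import numpy as np

def kurtosis_rfi_flag(samples, tol=0.3):
    """Flag RFI using the kurtosis statistic (illustrative sketch; the
    tolerance `tol` is an assumed value, not taken from the paper).

    Returns (flagged, kurtosis). Gaussian thermal noise gives kurtosis = 3;
    a significant deviation from 3 indicates RFI contamination.
    """
    d = samples - samples.mean()
    k = np.mean(d**4) / np.mean(d**2)**2  # Pearson kurtosis: Gaussian -> 3
    return abs(k - 3.0) > tol, k
```

For example, pure Gaussian noise is not flagged, whereas the same noise plus a strong sinusoid yields a kurtosis well below 3 and is flagged as continuous-wave RFI.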