Abstract

Two schemes for adaptive detection, Kelly's generalized likelihood ratio test (GLRT) and the mean level adaptive detector (MLAD), are analyzed with respect to the deleterious effect of desired-signal contamination of the data used to compute the sample covariance matrix for the two detectors. The detection probability P_D and false alarm performance (ghosting probability P_G) are predicted for the two schemes under the assumptions that the input noises are Gaussian random variables that are temporally independent but spatially correlated, and that the amplitude of the desired signal is Rayleigh distributed. P_D and P_G are computed as a function of the false alarm probability P_F with no contamination, the number of input channels, the number of independent samples per channel, the matched-filter output signal-to-noise (S/N) power ratio, and the S/N power ratio of the contaminating desired signal. It is shown that both P_D and P_G decrease with increasing levels of contamination. The P_G performance is almost identical for the GLRT and MLAD, and the P_D curves for the two detectors exhibit similar relative trends. Significantly, it is shown that the ghosting probability does not exceed P_F in the presence of contamination.
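To make the quantities in the abstract concrete, the following is a minimal numerical sketch of the two test statistics involved. It assumes Kelly's GLRT statistic in its standard form, |s^H S^-1 x|^2 / [(s^H S^-1 s)(K + x^H S^-1 x)], with S the sample covariance estimated from K training snapshots over N channels, together with an AMF-style statistic |s^H S^-1 x|^2 / (s^H S^-1 s) standing in for the mean-level adaptive detector. The toy spatial covariance, the steering vector, and all parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4    # number of input channels (illustrative)
K = 32   # independent training samples per channel (illustrative)

# Known steering vector of the desired signal (assumed, for illustration).
s = np.ones(N, dtype=complex) / np.sqrt(N)

# Toy spatial covariance: temporally independent, spatially correlated noise.
R = 0.5 ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
L = np.linalg.cholesky(R)

# K noise-only training snapshots; signal contamination would add a scaled
# copy of s to these columns, which is the effect the paper analyzes.
Z = L @ (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)

# Sample covariance matrix estimated from the training data.
S = (Z @ Z.conj().T) / K
Si = np.linalg.inv(S)

# Test-cell snapshot (noise only here, so both statistics should be small).
x = L @ (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

num = np.abs(s.conj() @ Si @ x) ** 2

# Kelly's GLRT statistic; by Cauchy-Schwarz it lies in [0, 1).
t_glrt = num / ((s.conj() @ Si @ s).real * (K + (x.conj() @ Si @ x).real))

# AMF-style statistic used here as a stand-in for the MLAD's CFAR form.
t_amf = num / (s.conj() @ Si @ s).real

print(t_glrt, t_amf)
```

Repeating the experiment with contaminated training columns (adding a random multiple of s to each) and counting threshold crossings would reproduce the P_D / P_G trends the abstract describes.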

