Abstract

Spectrum sensing is of utmost importance in cognitive radio and dynamic spectrum access systems for achieving spectrum awareness. To provide reliable spectrum awareness, it is critical to develop (near-) optimal sensing techniques and to understand achievable performance limits. In this paper, we study likelihood ratio test (LRT)-based methods for detection of a signal of interest (SOI) with multiple receive antennas in the presence of spatially correlated additive noise (colored noise). We show that, with on/off status information of the SOI available at the receiver, a training-based generalized likelihood ratio test (TB-GLRT) method can be designed that approximates the optimal LRT estimator-correlator (EC) detector. Using the inverse Laplace transform and complex Wishart distribution theory, we derive formulas for the detection and false alarm probabilities of both the LRT-EC and GLRT schemes. The convergence of the TB-GLRT to the LRT-EC is analyzed and proved based on the convergence behaviors of the detectors and their decision statistics. Simulation results verify the analytical convergence properties. In addition, the results illustrate the effects of different system design parameters and demonstrate that the TB-GLRT scheme can provide substantial performance improvement over several existing GLRT methods.
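To make the estimator-correlator structure concrete, the following is a minimal NumPy sketch of the LRT-EC decision statistic for a Gaussian signal in spatially colored Gaussian noise, T = Σ_k y_k^H C_n^{-1} C_s (C_s + C_n)^{-1} y_k. The covariances, array size, and sample count here are illustrative assumptions, not values from the paper; a practical TB-GLRT would replace the known covariances with estimates formed from SOI-off (noise-only) and SOI-on training samples.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4    # receive antennas (assumed for illustration)
N = 200  # snapshots per sensing window (assumed)

# Hypothetical covariances: colored-noise covariance C_n and a
# rank-one signal covariance C_s (single SOI, unknown channel h).
A = rng.standard_normal((M, M)) + 1j * rng.standard_normal((M, M))
C_n = A @ A.conj().T / M + np.eye(M)   # spatially correlated noise
h = rng.standard_normal((M, 1)) + 1j * rng.standard_normal((M, 1))
C_s = h @ h.conj().T                   # signal covariance

def ec_statistic(Y, C_s, C_n):
    """LRT estimator-correlator statistic summed over snapshots:
    T = sum_k y_k^H C_n^{-1} C_s (C_s + C_n)^{-1} y_k."""
    W = np.linalg.solve(C_n, C_s) @ np.linalg.inv(C_s + C_n)
    return np.real(np.einsum('ik,ij,jk->', Y.conj(), W, Y))

def sample(C, n):
    """Draw n zero-mean circular complex Gaussian snapshots with covariance C."""
    L = np.linalg.cholesky(C)
    return L @ (rng.standard_normal((M, n)) + 1j * rng.standard_normal((M, n))) / np.sqrt(2)

Y0 = sample(C_n, N)        # H0: noise only
Y1 = sample(C_s + C_n, N)  # H1: signal plus noise
print(ec_statistic(Y0, C_s, C_n), ec_statistic(Y1, C_s, C_n))
```

Comparing T against a threshold chosen for a target false alarm probability yields the detector; under H1 the statistic concentrates at a larger value than under H0, which is the separation the paper's closed-form probability expressions quantify.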
