Abstract

State-of-the-art goodness-of-fit (GoF) test algorithms for spectrum sensing make decisions directly from the temporal samples or energies of the observations, and they perform well when the received primary user (PU) signals are independent. However, when the received PU signals are highly correlated, these methods cannot achieve satisfactory performance. In this case, correlation is the dominant feature of the signals, and the maximum eigenvalue of the sample covariance matrix is a versatile statistic for capturing it. Motivated by this, we make full use of the correlation information embedded in the eigenvalues to improve GoF detection performance. Specifically, we first study the GoF test in the eigenvalue domain and design a semi-blind maximum-eigenvalue-based GoF detection scheme that uses the ratio of the maximum eigenvalue to the noise power. Since accurate knowledge of the noise power is not always available in practice, we then design a totally blind maximum-eigenvalue-based GoF detection method, which uses only the ratio of the maximum eigenvalue to the trace of the sample covariance matrix. Using recent results from random matrix theory, we provide a theoretical analysis of the proposed methods, including the probabilities of false alarm, the detection thresholds, and the probabilities of detection. Finally, simulation results show that the proposed algorithms outperform related GoF detection methods in terms of detection performance.
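For concreteness, the following minimal sketch (not taken from the paper) shows how the two test statistics named above could be computed from a block of received samples. The variable names, the toy correlated-signal model, and the omission of the actual GoF decision rule and random-matrix-theory thresholds are all illustrative assumptions.

```python
import numpy as np

def max_eigenvalue_statistics(Y, noise_power=None):
    """Compute the two eigenvalue-based statistics from received samples
    Y (M receive branches x N time samples).

    Returns the semi-blind statistic lambda_max / sigma^2 (if the noise
    power sigma^2 is supplied) and the totally blind statistic
    lambda_max / trace(R), where R is the sample covariance matrix.
    """
    M, N = Y.shape
    R = (Y @ Y.conj().T) / N                  # sample covariance matrix
    eigvals = np.linalg.eigvalsh(R)           # real eigenvalues, ascending
    lam_max = eigvals[-1]

    blind_stat = lam_max / np.trace(R).real   # totally blind: no sigma^2 needed
    semi_blind_stat = lam_max / noise_power if noise_power is not None else None
    return semi_blind_stat, blind_stat

# Toy usage under an assumed signal model: a common PU waveform scaled by
# per-branch channel gains (hence correlated across branches) plus noise.
rng = np.random.default_rng(0)
M, N, sigma2 = 4, 500, 1.0
s = rng.standard_normal(N)                    # common PU waveform
h = rng.standard_normal((M, 1))               # channel gains
Y_h1 = h * s + np.sqrt(sigma2) * rng.standard_normal((M, N))  # signal present
Y_h0 = np.sqrt(sigma2) * rng.standard_normal((M, N))          # noise only

print(max_eigenvalue_statistics(Y_h1, sigma2))
print(max_eigenvalue_statistics(Y_h0, sigma2))
```

Under the signal-present hypothesis, the rank-one PU component inflates the maximum eigenvalue relative to both the noise power and the trace, so both statistics rise; the full methods in the paper feed such statistics into a GoF test with thresholds derived from random matrix theory.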
