Abstract

A K-user direct-sequence spread-spectrum code-division multiple-access (CDMA) system with (q ≪ log₂ K)-bit baseband signal quantization at the demodulator is considered. It is shown that additionally quantizing the (K + 1)-level output signal of the CDMA modulator into q bits significantly improves the average bit-error performance in a non-negligible regime of noise variance, σ², and user load, β, under various system settings, such as additive white Gaussian noise (AWGN), Rayleigh fading, single-user detection, multi-user detection, and random and orthogonal spreading codes. For the case of single-user detection in random-spreading AWGN-CDMA, this regime is identified explicitly in closed form in terms of σ², β, and a pre-factor γ(q) depending on q, and the associated bit-error-rate (BER) improvement is derived analytically for q = 1, 2. For the other system settings examined, computer simulations are provided that corroborate this behavior.
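The setup lends itself to a quick numerical check. Below is a minimal Monte-Carlo sketch, not the authors' code, of the single-user-detection, random-spreading AWGN case: each chip of the (K + 1)-level modulator output is optionally re-quantized to q bits before transmission, and the receiver applies the same q-bit front end followed by matched filtering. The uniform mid-rise quantizer, the clipping range xmax, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def uniform_quantize(x, q, xmax):
    """Uniform q-bit mid-rise quantizer clipped to [-xmax, xmax] (an assumed model)."""
    levels = 2 ** q
    step = 2.0 * xmax / levels
    idx = np.clip(np.floor(x / step) + levels // 2, 0, levels - 1)
    return (idx - levels // 2 + 0.5) * step

def ber_single_user(K, N, sigma, q, quantize_modulator, trials=2000, seed=0):
    """Empirical BER of matched-filter (single-user) detection, load beta = K/N."""
    rng = np.random.default_rng(seed)
    errors = 0
    for _ in range(trials):
        S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)  # random binary spreading
        b = rng.choice([-1.0, 1.0], size=K)                    # BPSK data bits
        x = S @ b                                              # (K+1)-level chip values
        if quantize_modulator:
            x = uniform_quantize(x, q, xmax=3.0)               # extra q-bit modulator quantization
        y = x + sigma * rng.standard_normal(N)                 # AWGN channel, variance sigma^2
        r = uniform_quantize(y, q, xmax=3.0)                   # q-bit quantization at the demodulator
        z = S.T @ r                                            # matched-filter statistics per user
        errors += np.count_nonzero(np.sign(z) != b)
    return errors / (trials * K)

# Illustrative comparison at beta = K/N = 0.5 and q = 1 (parameter choices are assumptions):
for mod_q in (False, True):
    print("modulator quantized:", mod_q,
          "BER:", ber_single_user(K=16, N=32, sigma=0.3, q=1, quantize_modulator=mod_q))
```

Sweeping sigma and the load K/N in such a sketch is one way to probe empirically where the modulator-side quantization helps, in the spirit of the regime the abstract characterizes analytically.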
