Abstract

A K-user direct-sequence spread-spectrum code-division multiple-access (CDMA) system with q-bit (q ≪ log₂ K) baseband signal quantization at the demodulator is considered. It is shown that additionally quantizing the (K+1)-level output signal of the CDMA modulator into q bits significantly improves the average bit-error rate (BER) in a non-negligible regime of noise variance, σ², and user load, β, under various system settings: additive white Gaussian noise (AWGN), Rayleigh fading, single-user detection, multi-user detection, and random and orthogonal spreading codes. For the case of single-user detection in random-spreading AWGN-CDMA, this regime is identified explicitly as σ < γ(q)√β, where γ(q) is a certain pre-factor depending on q, and the associated BER improvement is derived analytically for q = 1, 2. For the other examined system settings, computer simulations are provided, corroborating this interesting behavior.
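
To make the system model concrete, the following is a minimal Monte Carlo sketch of the single-user-detection, random-spreading AWGN case described above. It is not the authors' code: the parameter values (K, N, q, the noise grid), the mid-rise uniform quantizer, and the matched-filter receiver are all illustrative assumptions, used only to show where the two q-bit quantization steps (modulator output and demodulator input) sit in the signal chain.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_quantize(x, q, xmax):
    """Mid-rise uniform quantizer with 2**q levels on [-xmax, xmax]."""
    levels = 2 ** q
    step = 2.0 * xmax / levels
    idx = np.clip(np.floor((x + xmax) / step), 0, levels - 1)
    return -xmax + (idx + 0.5) * step

def ber(K, N, sigma, q, quantize_tx, n_frames=2000):
    """Empirical BER of single-user matched-filter detection in AWGN-CDMA."""
    errors, total = 0, 0
    for _ in range(n_frames):
        S = rng.choice([-1.0, 1.0], size=(N, K))  # random binary spreading codes
        b = rng.choice([-1.0, 1.0], size=K)       # BPSK data bits, one per user
        tx = S @ b                                # (K+1)-level chip-rate signal
        if quantize_tx:
            tx = uniform_quantize(tx, q, K)       # q-bit modulator quantization
        rx = tx + sigma * rng.standard_normal(N)  # AWGN channel
        rx = uniform_quantize(rx, q, K)           # q-bit demodulator quantization
        b_hat = np.where(S.T @ rx >= 0, 1.0, -1.0)  # per-user matched filter + sign
        errors += int(np.sum(b_hat != b))
        total += K
    return errors / total

if __name__ == "__main__":
    K, N, q = 16, 32, 2                           # user load beta = K/N = 0.5
    for sigma in (0.5, 1.0, 2.0, 4.0):
        print(f"sigma={sigma}: BER tx unquantized={ber(K, N, sigma, q, False):.4f}, "
              f"tx quantized={ber(K, N, sigma, q, True):.4f}")
```

Sweeping σ for a fixed load β with and without the modulator-side quantizer is one way to probe empirically for the kind of low-noise regime (σ < γ(q)√β) that the abstract characterizes analytically.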
