Abstract
The linear detector required in direct-sequence code division multiple access (DS-CDMA) communication systems is classically designed based on the minimum mean square error (MMSE) criterion, which can be implemented efficiently using standard adaptive algorithms such as the least mean square (LMS) algorithm. Because the probability distribution of the linear detector's soft output is generally non-Gaussian, the MMSE solution can be far from the optimal minimum bit error rate (MBER) solution. Adopting a non-Gaussian approach naturally leads to the MBER linear detector. Based on the Parzen window, or kernel density estimation, approach for approximating the probability density function (p.d.f.), a stochastic gradient adaptive MBER algorithm, called the least bit error rate (LBER) algorithm, is developed for training a linear multiuser detector. A simplified or approximate LBER (ALBER) algorithm is particularly promising, as its computational complexity is similar to that of the classical LMS algorithm. Furthermore, the ALBER algorithm can be extended to nonlinear multiuser detection.
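To illustrate how an LMS (MMSE) update and an ALBER-style stochastic gradient update differ in practice, the following Python sketch trains both on a toy two-user synchronous DS-CDMA model. The spreading codes, user amplitudes, step sizes, and kernel width `rho` are illustrative assumptions, not values from the paper; the ALBER update shown follows the general single-sample kernel (Parzen window) gradient form described above and is a sketch rather than the paper's exact expression.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy synchronous 2-user DS-CDMA setup (illustrative parameters) ---
N = 8                                         # spreading factor (assumed)
codes = np.sign(rng.standard_normal((2, N)))  # random +/-1 spreading codes
amps = np.array([1.0, 1.5])                   # user amplitudes; user 0 is desired
sigma = 0.4                                   # channel noise standard deviation

def receive(bits):
    """Chip-rate received vector for one symbol interval."""
    return (amps[:, None] * codes * bits[:, None]).sum(axis=0) \
        + sigma * rng.standard_normal(N)

# --- Classical LMS (MMSE) training of a linear detector w ---
def train_lms(n_iter=5000, mu=0.01):
    w = codes[0].astype(float) / N            # initialise at matched filter
    for _ in range(n_iter):
        bits = rng.choice([-1.0, 1.0], size=2)
        r = receive(bits)
        y = w @ r
        e = bits[0] - y                       # error against desired user's bit
        w += mu * e * r                       # standard LMS update
    return w

# --- ALBER-style stochastic gradient training (sketch) ---
def train_alber(n_iter=5000, mu=0.05, rho=0.3):
    w = codes[0].astype(float) / N
    for _ in range(n_iter):
        bits = rng.choice([-1.0, 1.0], size=2)
        r = receive(bits)
        y = w @ r
        ys = bits[0] * y                      # signed soft output of desired user
        # single-sample Gaussian kernel (Parzen window) term of width rho
        g = np.exp(-ys**2 / (2 * rho**2)) / (np.sqrt(2 * np.pi) * rho)
        w += mu * g * bits[0] * r             # nudge boundary to reduce bit errors
    return w

def ber(w, n_sym=20000):
    """Empirical bit error rate of detector w for user 0."""
    bits = rng.choice([-1.0, 1.0], size=(n_sym, 2))
    errs = sum(np.sign(w @ receive(b)) != b[0] for b in bits)
    return errs / n_sym

print("LMS   detector BER:", ber(train_lms()))
print("ALBER detector BER:", ber(train_alber()))
```

Note that both training loops cost one inner product and one scaled-vector update per symbol, which reflects the abstract's point that the ALBER algorithm has a per-iteration complexity comparable to that of the LMS algorithm.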