Abstract

The problem of constructing adaptive minimum bit error rate (MBER) linear multiuser detectors is considered for direct-sequence code division multiple access (DS-CDMA) signals transmitted through multipath channels. Based on a kernel density estimation approach to approximating the bit error rate (BER) from training data, a least mean squares (LMS)-style stochastic gradient adaptive algorithm is developed for training linear multiuser detectors. Computer simulation is used to study the convergence speed and steady-state BER misadjustment of this adaptive MBER linear multiuser detector, and the results show that it outperforms an existing LMS-style adaptive MBER algorithm presented by Yeh et al. (Proc. GLOBECOM, Sydney, Australia, 1998, pp. 3590-3595).
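
The abstract does not give the update equations, so the following is only a minimal Python sketch of the general idea: a single-sample kernel density estimate of the BER yields an LMS-style stochastic gradient update for the linear detector weights. The toy two-user synchronous scenario, the spreading codes, the step size mu, and the kernel width rho are all illustrative assumptions rather than parameters from the paper, and the update shown is the simplified form that omits normalisation by the detector norm.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy setup: 2-user synchronous DS-CDMA with real-valued BPSK.
    # These spreading codes, sigma, mu, and rho are illustrative choices only.
    s1 = np.array([+1, +1, +1, +1, -1, -1, +1, -1], dtype=float) / np.sqrt(8)
    s2 = np.array([+1, -1, +1, -1, +1, -1, +1, +1], dtype=float) / np.sqrt(8)
    sigma = 0.3          # noise standard deviation
    mu = 0.2             # LMS-style step size
    rho = 0.4            # kernel width of the density estimate

    w = s1.copy()        # initialise the detector at the matched filter for user 1

    def receive(n):
        """Generate n received chip vectors and the user-1 training bits."""
        b1 = rng.choice([-1.0, 1.0], size=n)
        b2 = rng.choice([-1.0, 1.0], size=n)
        r = np.outer(b1, s1) + np.outer(b2, s2) + sigma * rng.standard_normal((n, 8))
        return r, b1

    # Stochastic gradient descent on a single-sample kernel BER estimate:
    #   w <- w + mu / (2*sqrt(2*pi)*rho) * exp(-ys^2 / (2*rho^2)) * b1 * r,
    # where ys = b1 * (w^T r) is the signed decision variable, so samples near
    # the decision boundary (small ys) dominate the adaptation.
    r_train, b_train = receive(2000)
    for r_k, b_k in zip(r_train, b_train):
        ys = b_k * (w @ r_k)                      # signed detector output
        g = np.exp(-ys**2 / (2 * rho**2))         # Gaussian kernel weight
        w += mu / (2 * np.sqrt(2 * np.pi) * rho) * g * b_k * r_k

    # Measure the BER of the trained detector on fresh data.
    r_test, b_test = receive(200_000)
    ber = np.mean(np.sign(r_test @ w) != b_test)
    print(f"estimated BER after MBER training: {ber:.4f}")

Note the contrast with an ordinary LMS detector, which minimises mean square error: here the Gaussian kernel term concentrates the updates on received vectors whose decision variable lies close to the decision boundary, which is what ties the adaptation to the BER rather than to the MSE.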
