Abstract

Adaptive training of neural networks is typically done with a stochastic gradient algorithm that minimizes the mean square error (MSE). For many classification applications, however, such as channel equalization and code-division multiple-access (CDMA) multiuser detection, the real goal is to minimize the error probability, and adopting the MSE criterion can lead to poor performance. A nonlinear adaptive near-minimum-error-rate algorithm, called the nonlinear least bit error rate (NLBER) algorithm, is developed for training neural networks in these applications. The proposed method is applied to downlink multiuser detection in CDMA communication systems. Simulation results show that the NLBER algorithm converges quickly and that a small radial basis function network trained with it can closely match the performance of the optimal Bayesian multiuser detector. The results also confirm that training the neural network multiuser detector with the least mean square algorithm, although it generally converges well in the MSE, can produce poor error rate performance.
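The contrast the abstract draws between MSE-driven and error-rate-driven training can be illustrated with a minimal sketch. The toy setup below (signature vector, noise level, step size, and kernel width are all illustrative choices, not values from the paper) trains a simple linear detector two ways: a standard LMS update, and an LBER-style update that follows the stochastic gradient of a Gaussian-kernel-smoothed bit-error-rate estimate. This is only the linear special case of the idea; the paper's NLBER algorithm applies the same principle to the parameters of a nonlinear (radial basis function) network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy detection problem: recover bit b in {-1, +1} from a 2-D received
# vector r = b * s + noise. The signature s and noise level are illustrative.
s = np.array([1.0, 0.5])
N = 2000
b = rng.choice([-1.0, 1.0], size=N)
r = b[:, None] * s + 0.3 * rng.standard_normal((N, 2))

def train_lms(mu=0.05):
    # LMS: stochastic gradient descent on the MSE, E[(b - w^T r)^2].
    w = np.zeros(2)
    for k in range(N):
        e = b[k] - w @ r[k]
        w += mu * e * r[k]
    return w

def train_lber(mu=0.05, rho=0.5):
    # LBER-style update: stochastic gradient on a kernel-smoothed BER
    # estimate. A sample contributes to the gradient in proportion to a
    # Gaussian kernel of its signed decision variable, so samples near the
    # decision boundary dominate. (A sketch, not the paper's exact NLBER form.)
    w = np.zeros(2)
    for k in range(N):
        ys = b[k] * (w @ r[k])                 # signed decision variable
        g = np.exp(-ys**2 / (2.0 * rho**2))    # kernel weight
        w += mu * g * b[k] * r[k]
    return w

def ber(w):
    # Empirical bit error rate of the sign detector sgn(w^T r).
    return np.mean(np.sign(r @ w) != b)

w_lms, w_lber = train_lms(), train_lber()
print("LMS BER:", ber(w_lms), "LBER BER:", ber(w_lber))
```

On this easy, well-separated toy problem both detectors perform well; the abstract's point is that in harder settings (e.g. ill-conditioned or highly nonlinear decision boundaries) the MSE-optimal solution and the minimum-BER solution can differ substantially.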
