Abstract

Optimizing the performance of optical code-division multiple-access (CDMA) systems requires determining how sensitive the system's bit-error rate (BER) is to various system parameters. Asymptotic approximations and bounds on bit-error probabilities seldom capture these sensitivities. We develop single-run gradient-estimation methods for optical CDMA systems using a discrete-event dynamic systems (DEDS) approach. Specifically, computer-aided techniques such as infinitesimal perturbation analysis (IPA) and likelihood-ratio (LR) methods are used to analyze the sensitivity of the average BER to a wide class of system parameters. The formulation is shown to apply equally to time-encoded and frequency-encoded systems. Furthermore, the derived estimates are unbiased, and the optimality of their variance is established via the theory of common random variates and importance-sampling techniques.
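To make the likelihood-ratio (score-function) idea concrete, here is a minimal sketch on a toy detector rather than the paper's optical CDMA system: a bit is received as a Gaussian sample, an error is declared when the sample crosses a threshold, and the gradient of the error probability with respect to the Gaussian mean is estimated in a single Monte Carlo run by weighting each error indicator with the score of the sampling density. All names, parameters, and the detector model below are illustrative assumptions, not taken from the paper.

```python
import random


def lr_gradient_of_ber(mu, sigma, threshold, n=200_000, seed=1):
    """Likelihood-ratio (score-function) estimate of d BER / d mu.

    Toy model (an assumption for illustration): bit '0' is received as
    X ~ N(mu, sigma^2) and declared in error when X > threshold.
    Since d/d mu E[1{X > t}] = E[1{X > t} * d/d mu log p(X; mu)],
    averaging the error indicator times the score gives an unbiased
    gradient estimate from the same samples used for the BER itself.
    """
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        err = 1.0 if x > threshold else 0.0
        score = (x - mu) / sigma**2  # d/d mu of log N(x; mu, sigma^2)
        acc += err * score
    return acc / n
```

Because the indicator and the score are computed from the same sample path, the estimate comes from a single simulation run; the paper's common-random-variates and importance-sampling arguments concern reducing the variance of exactly this kind of estimator.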


