Abstract

In this paper we present a theory of the bit error rate (BER) of Euclidean metric-based maximum likelihood sequence detectors (EM-MLSD) in the presence of channel mismatch caused by non-Gaussian noise. Although the theory is general, here we focus on the effects of the quantization noise (QN) added by the front-end analog-to-digital converter (ADC) typically used in DSP-based implementations of the receiver. Numerical results show close agreement between the predictions of the theoretical analysis and computer simulations. As a practical application of the proposed theory, we investigate the performance of EM-MLSD in 10 Gb/s Ethernet receivers for multimode optical fibers. Since the BER required in this application is below 10⁻¹², which precludes the use of computer simulations to estimate it, a theoretical study of MLSD performance including the combined effects of channel dispersion and QN becomes necessary. We present numerical results for the three stressors specified by the 10GBASE-LRM standard. Our study shows that the impact of the QN added by the ADC on performance depends strongly on the channel dispersion (i.e., the stressor).
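To make the setup concrete, the following is a minimal Python sketch (not from the paper) of an Euclidean metric-based sequence detector operating on ADC-quantized observations of a dispersive channel, with the BER estimated by Monte Carlo simulation. The channel taps h, the SNR, the ADC resolution, and the helper functions quantize and viterbi_em_mlsd are all illustrative assumptions; the sketch also makes plain why simulation is impractical at the BER levels targeted by 10GBASE-LRM, since verifying a BER near 10⁻¹² would require on the order of 10¹³ transmitted symbols.

```python
import numpy as np

def quantize(x, n_bits, full_scale):
    """Illustrative uniform mid-rise ADC: clip to [-FS, FS) and map to 2**n_bits levels."""
    levels = 2 ** n_bits
    step = 2 * full_scale / levels
    x = np.clip(x, -full_scale, full_scale - step)
    return np.floor(x / step) * step + step / 2

def viterbi_em_mlsd(r, h):
    """Euclidean-metric Viterbi detector for a binary ISI channel with memory 1.
    State = previous symbol in {-1, +1}; the metric ignores the non-Gaussian QN (mismatch)."""
    n = len(r)
    symbols = np.array([-1.0, 1.0])
    metric = np.zeros(2)                   # path metrics for states (-1, +1)
    back = np.zeros((n, 2), dtype=int)     # backpointers for traceback
    for k in range(n):
        new_metric = np.full(2, np.inf)
        for s_new in range(2):             # candidate current symbol
            for s_old in range(2):         # candidate previous symbol
                y_hat = h[0] * symbols[s_new] + h[1] * symbols[s_old]
                m = metric[s_old] + (r[k] - y_hat) ** 2   # Euclidean branch metric
                if m < new_metric[s_new]:
                    new_metric[s_new] = m
                    back[k, s_new] = s_old
        metric = new_metric
    # Trace back the survivor path.
    s = int(np.argmin(metric))
    a_hat = np.zeros(n)
    for k in range(n - 1, -1, -1):
        a_hat[k] = symbols[s]
        s = back[k, s]
    return a_hat

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5])                   # hypothetical dispersive channel (memory 1)
n_sym, snr_db, n_bits = 200_000, 14.0, 4   # illustrative simulation parameters
a = rng.choice([-1.0, 1.0], size=n_sym)
y = h[0] * a + h[1] * np.concatenate(([0.0], a[:-1]))
sigma = np.sqrt(np.sum(h ** 2) / (2 * 10 ** (snr_db / 10)))
r = quantize(y + sigma * rng.standard_normal(n_sym), n_bits, full_scale=2.0)
ber = np.mean(viterbi_em_mlsd(r, h) != a)
print(f"Simulated BER with {n_bits}-bit ADC: {ber:.2e}")
```

Sweeping n_bits in such a sketch shows the QN-induced penalty growing as the ADC resolution drops, but only down to BER values reachable by simulation; quantifying the penalty at 10⁻¹² is exactly what the analytical theory in the paper is for.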
