Abstract
Interferometric noise, arising from the optical interference of the desired information signal with parasitic crosstalk waveforms at the photodetector, afflicts practically all lightwave communication networks, inducing unacceptable power penalties and bit-error-rate floors. In this paper, the induced performance degradation is quantified, both experimentally and analytically, and solution paths are identified. It is concluded that the total crosstalk level of noise-generating parasitics in a generalized optical network must be held below -25 dB for a penalty of less than 1 dB; a further 2 to 4 dB may lead to network failure. Otherwise, means of suppressing the noise by RF rejection at the receiver must be invoked. A number of approaches to achieving a reduction in the level of interferometric noise are presented and contrasted.
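To give a feel for the scale of the -25 dB budget quoted above, the following sketch evaluates the standard worst-case eye-closure approximation for a single interferometric crosstalk term. This simplified single-interferer bound is an assumption for illustration only; it is looser than the paper's full statistical analysis (which accounts for multiple interferers and the target bit error rate), so it yields somewhat smaller penalties at a given crosstalk level.

```python
import math

def worst_case_penalty_db(crosstalk_db: float) -> float:
    """Worst-case eye-closure penalty (dB) for a single interferometric
    crosstalk term of relative power `crosstalk_db`.

    The beat between the signal field and a crosstalk field of relative
    power eps can close the eye by up to 2*sqrt(eps), giving
        penalty = -10*log10(1 - 2*sqrt(eps))
    (a common textbook approximation, not the paper's exact analysis).
    """
    eps = 10 ** (crosstalk_db / 10)  # linear crosstalk power ratio
    closure = 2 * math.sqrt(eps)     # peak beat amplitude vs. signal
    if closure >= 1:                 # eye fully closed: error-rate floor
        return math.inf
    return -10 * math.log10(1 - closure)

for x_db in (-30, -25, -20, -15):
    print(f"{x_db} dB crosstalk -> {worst_case_penalty_db(x_db):.2f} dB penalty")
```

Even under this optimistic bound, the penalty grows rapidly once the crosstalk rises past roughly -20 dB, consistent with the abstract's warning that a further 2 to 4 dB beyond the budget can lead to network failure.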