Abstract

A theoretical model for the noise analysis of the system performance of 1.55-μm single-frequency semiconductor lasers is presented. Computer simulations are used to analyze the role of various noise sources in a 1.7-Gbit/s transmission experiment where the data was transmitted over 69 km using a 1.56-μm distributed-feedback laser. The bit-error-rate curves generated from numerical simulations agree well with the results of the transmission experiment. The relative contributions of various noise sources in limiting the system performance are discussed and compared. In particular, we consider circuit noise, shot noise, laser intensity noise, mode-partition noise, parasitic reflections, and the frequency chirp.
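As a rough illustration of the kind of receiver-noise budget the abstract describes, the sketch below estimates bit-error rate from a Gaussian Q-factor by summing independent circuit (thermal), shot, and laser-intensity-noise variances. This is a generic textbook formulation, not the authors' model: every parameter value (responsivity, bandwidth, RIN level, extinction ratio) is an assumption chosen for illustration, and mode-partition noise, parasitic reflections, and frequency chirp are omitted for brevity.

```python
import numpy as np
from scipy.special import erfc

# Illustrative constants and receiver parameters (assumed, not from the paper)
q = 1.602e-19          # electron charge [C]
kB = 1.381e-23         # Boltzmann constant [J/K]
T = 300.0              # receiver temperature [K]
R = 0.85               # photodiode responsivity [A/W]
B = 1.3e9              # effective receiver bandwidth [Hz] (~0.75 x 1.7 Gbit/s)
RL = 50.0              # load resistance [ohm]
RIN = 10**(-140 / 10)  # laser relative intensity noise [1/Hz]
ext = 0.1              # extinction ratio P0/P1

def ber_vs_power(P1_dBm):
    """Estimate BER from a Gaussian Q-factor, summing circuit (thermal),
    shot, and intensity-noise variances for the '1' and '0' levels."""
    P1 = 1e-3 * 10**(P1_dBm / 10.0)    # received power in the '1' state [W]
    P0 = ext * P1                      # received power in the '0' state [W]
    I1, I0 = R * P1, R * P0            # photocurrents [A]
    var_circuit = 4 * kB * T * B / RL  # thermal/circuit noise (same for both levels)
    s1 = np.sqrt(var_circuit + 2 * q * I1 * B + RIN * I1**2 * B)
    s0 = np.sqrt(var_circuit + 2 * q * I0 * B + RIN * I0**2 * B)
    Q = (I1 - I0) / (s1 + s0)
    return 0.5 * erfc(Q / np.sqrt(2))

# Example: BER versus received power, analogous to a measured BER curve
for p_dBm in range(-36, -24, 2):
    print(f"{p_dBm} dBm -> BER ~ {ber_vs_power(p_dBm):.2e}")
```

Sweeping the received power in this way produces the familiar waterfall-shaped BER curve; the additional noise sources treated in the paper would enter as further variance terms in the same Q-factor expression.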
