Abstract

In this paper, we present a simple analytical model to characterize the effect of fiber chromatic dispersion when using a multisection distributed-Bragg reflector (DBR) semiconductor laser as a millimeter-wave optical transmitter in a millimeter-wave fiber-radio system. We characterize the dispersion penalty of the laser as a function of the laser operating conditions and establish that the penalty depends on the distribution of optical power among the modes in the laser output, which in turn is governed by the spectrum-filtering property of the laser DBR section and the gain profile of the laser. In addition to the dispersion penalty, we investigate the stability of the millimeter-wave carrier generated by the multisection laser, including the detected RF power and the resulting phase noise. We establish that the optimum laser bias condition is a compromise among minimum dispersion penalty, maximum received RF power, and minimum phase noise of the generated millimeter-wave carrier.
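The dependence of the dispersion penalty on the distribution of optical power among the laser modes can be illustrated with a minimal sketch. This is not the paper's analytical model: it is a generic coherent-sum calculation that assumes equally spaced optical modes, the textbook quadratic dispersion phase of standard single-mode fiber, and ideal square-law photodetection with laser phase noise ignored; the mode spacing, mode weights, and fiber parameters below are hypothetical values chosen only for illustration.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def rf_power_penalty(mode_freqs_hz, mode_powers, fiber_length_m,
                     dispersion_ps_nm_km=17.0, wavelength_m=1550e-9):
    """Estimate the dispersion-induced penalty (dB) on the detected mm-wave beat note.

    Each optical mode accumulates a quadratic phase relative to the carrier after
    propagating through dispersive fiber; the photodetected RF tone at the mode
    spacing is the coherent sum of all adjacent-mode beat products, so the penalty
    depends on how optical power is distributed among the modes.
    """
    f = np.asarray(mode_freqs_hz, dtype=float)
    a = np.sqrt(np.asarray(mode_powers, dtype=float))       # optical field amplitudes
    f0 = np.average(f, weights=mode_powers)                  # power-weighted carrier

    # Convert D from ps/(nm*km) to s/m^2 and form the quadratic dispersion phase.
    D = dispersion_ps_nm_km * 1e-6                            # s/m^2
    beta = np.pi * D * fiber_length_m * wavelength_m**2 / C
    phi = beta * (f - f0) ** 2

    def beat_amplitude(phase):
        # Adjacent-mode products all beat at the mode spacing and add coherently.
        terms = a[:-1] * a[1:] * np.exp(1j * (phase[1:] - phase[:-1]))
        return np.abs(terms.sum())

    p_fiber = beat_amplitude(phi) ** 2
    p_b2b = beat_amplitude(np.zeros_like(phi)) ** 2           # back-to-back reference
    return 10.0 * np.log10(p_fiber / p_b2b)                   # <= 0 dB

if __name__ == "__main__":
    # Hypothetical 37.5-GHz-spaced modes with most power in two dominant lines.
    spacing = 37.5e9
    freqs = C / 1550e-9 + spacing * np.arange(-2, 3)
    powers = [0.02, 0.10, 1.00, 0.80, 0.05]                   # illustrative weights only
    for km in (0, 10, 25, 50):
        print(f"{km:3d} km: {rf_power_penalty(freqs, powers, km * 1e3):6.2f} dB")
```

In this sketch, a source with only two dominant modes shows essentially no fading (the single beat product only acquires a phase), while residual power in neighboring modes makes the beat products add with different dispersion-induced phases and degrades the detected RF power, consistent with the abstract's claim that the penalty tracks the modal power distribution.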
