Abstract This paper studies the performance of a free-space optical (FSO) communication link in which a Gaussian laser beam carries information using direct-current-biased optical orthogonal frequency division multiplexing (DCO-OFDM). While optical multi-carrier modulation alleviates multi-path fading, the higher transmit power contributed by the dc-bias helps the optical beam partially offset the intensity loss caused by beam divergence in the free-space channel. To reduce the overall transmitted power while avoiding nonlinearities due to a high peak-to-average power ratio (PAPR), we employ a PAPR-dependent dc-bias to generate the DCO-OFDM light beam from a pre-clipped electrical baseband OFDM signal. The radiated optical beam encounters turbulence-induced impairments such as beam spreading and beam wander, which degrade the signal-to-noise ratio (SNR) of the received data symbols. In addition, signal pre-clipping and the limited linearity of the optical source generate clipping-noise components, which compromise receiver performance or shorten the span length over which the target bit error rate (BER) can be achieved. We derive an analytical model for evaluating the performance of the FSO link by modeling the various impairments as noise variances and characterizing atmospheric turbulence through the refractive-index structure parameter under weak- and strong-turbulence conditions. Numerical results, obtained by varying the system design parameters in the model, graphically provide useful insight into the power penalties and the various trade-offs involved in operating the link over a longer span, at a higher data rate, and with reduced transmitted power through pre-clipping, while ensuring the desired BER.
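The transmitter chain summarized in the abstract (real-valued OFDM via Hermitian-symmetric subcarrier loading, pre-clipping of the baseband signal, then a dc-bias sized from the clipped signal) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the IFFT size, QPSK mapping, and the clipping ratio `mu` are assumed values, and the bias rule shown (bias equal to the clipped peak, so the drive signal is non-negative) is one simple PAPR-dependent choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper): 64-point IFFT, QPSK.
N = 64
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Hermitian-symmetric subcarrier loading so the IFFT output is real-valued,
# as required for intensity modulation of the laser.
X = np.zeros(N, dtype=complex)
data = rng.choice(qpsk, N // 2 - 1)
X[1:N // 2] = data
X[N // 2 + 1:] = np.conj(data[::-1])

x = np.fft.ifft(X).real * np.sqrt(N)   # real baseband OFDM symbol

# Pre-clipping: limit peaks to mu times the RMS amplitude (mu is assumed).
sigma = np.std(x)
mu = 3.0
x_clipped = np.clip(x, -mu * sigma, mu * sigma)

# PAPR-dependent dc-bias: here, a bias just large enough to make the
# clipped signal non-negative before it drives the optical source.
bias = np.max(np.abs(x_clipped))
x_dco = x_clipped + bias               # non-negative drive signal
```

Pre-clipping caps the peak at `mu * sigma`, which in turn caps the required dc-bias; this is the mechanism by which the scheme trades clipping noise against total transmitted power.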