Abstract

The paper investigates the impact of the Rayleigh-distributed statistical behavior of the peak-to-average power ratio (PAPR) of a pre-clipped signal on the performance metrics of a direct-current-biased optical orthogonal frequency division multiplexing (DCO-OFDM) system. The analytical model considers a pre-clipped, DC-shifted baseband OFDM signal driving an optical source over its linear operating range. The model employs a bias-scaling factor, varied heuristically over its full range (0 to 1), to examine the improvement in overall power efficiency. It further uses the cumulative distribution function (CDF) of the pre-clipped signal to obtain a weighted estimate of the available signal power within the clipped PAPR, and it accounts for the clipping-noise effects caused by the limited linearity of the optical source during electrical-to-optical conversion of the baseband OFDM signal. Using this model, the paper arrives at a realistic estimate of system behavior in terms of bit error rate, electrical power efficiency, and spectral efficiency. From theoretical simulation results, for a given set of operating parameters (signal power, PAPR, bias-scaling factor, modulation order, and sub-carrier count), the paper examines the trade-offs involved in optimizing these performance metrics over the appropriate dynamic range of the DCO-OFDM transmitter.
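The abstract does not reproduce the paper's equations, so the following Python sketch is only a minimal illustration of the transmit chain it describes: a real-valued baseband OFDM frame (Hermitian-symmetric IFFT), pre-clipping to a PAPR ceiling, a DC shift controlled by a bias-scaling factor k in (0, 1), and a Monte-Carlo estimate of the pre-clipped PAPR CDF. The parameter values, the bias convention B_dc = k*A, and the helper name dco_ofdm_frame are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative parameters (values are assumptions, not the paper's):
N = 256              # sub-carrier count
M = 16               # M-QAM modulation order
k = 0.6              # bias-scaling factor, varied over (0, 1) in the paper
papr_limit_db = 9.0  # assumed pre-clipping PAPR ceiling

def dco_ofdm_frame(rng):
    """One real-valued baseband OFDM frame via a Hermitian-symmetric IFFT."""
    side = int(np.sqrt(M))
    re = rng.integers(0, side, N // 2 - 1) * 2 - (side - 1)
    im = rng.integers(0, side, N // 2 - 1) * 2 - (side - 1)
    data = (re + 1j * im) / np.sqrt(2 * (side**2 - 1) / 3)  # unit-energy M-QAM
    X = np.zeros(N, dtype=complex)
    X[1:N // 2] = data
    X[N // 2 + 1:] = np.conj(data[::-1])   # Hermitian symmetry -> real IFFT
    return np.fft.ifft(X).real * np.sqrt(N)

# Pre-clip one frame to the PAPR ceiling, then apply the DC shift.
x = dco_ofdm_frame(rng)
sigma = np.sqrt(np.mean(x**2))
A = sigma * 10 ** (papr_limit_db / 20)     # amplitude bound from the PAPR limit
x_clip = np.clip(x, -A, A)
B_dc = k * A                               # assumed bias convention: B = k * A
x_tx = np.clip(x_clip + B_dc, 0.0, None)   # optical intensity must be >= 0

# Monte-Carlo estimate of the pre-clipped PAPR CDF at the clipping level,
# standing in for the CDF-weighted power estimate the abstract describes.
paprs = np.array([
    np.max(f**2) / np.mean(f**2)
    for f in (dco_ofdm_frame(rng) for _ in range(2000))
])
cdf_at_limit = np.mean(10 * np.log10(paprs) <= papr_limit_db)

print(f"single-frame PAPR: {10*np.log10(np.max(x**2)/np.mean(x**2)):.2f} dB")
print(f"empirical P(PAPR <= {papr_limit_db} dB) = {cdf_at_limit:.3f}")
print(f"residual lower clipping after bias (k={k}): "
      f"{np.mean(x_clip + B_dc < 0):.3f} of samples")
```

Under this assumed convention, k = 1 shifts the clipped signal entirely above zero (no lower clipping but maximum DC power), while smaller k trades residual clipping noise for improved electrical power efficiency, which mirrors the trade-off the abstract examines.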
