Abstract

The signal-to-interference-plus-noise ratio of the noised single-sideband optical orthogonal frequency-division multiplexing (SSB-OOFDM) signal in a direct-detection OFDM (DDO-OFDM) system is first derived theoretically. The relationship of the carrier-to-sideband ratio (CSR) and guard band (GB) to the signal-to-signal beat interference (SSBI) is explored. According to our theoretical analysis, the degradation caused by SSBI on the SSB-OOFDM signal worsens as the GB is reduced, whereas an increase in the CSR can alleviate this degradation. A tradeoff between the GB and CSR therefore balances their joint influence on system performance. A simulation of a 20-km optical link carrying a 40-Gb/s 16-quadrature-amplitude-modulation (16-QAM) noised SSB-OOFDM signal is conducted to confirm the theoretical results. It shows that, without any complex device or algorithm, the GB can be reduced to 40% merely by increasing the CSR to $\sim$8 dB; the spectral efficiency of the DDO-OFDM link is thus improved after optimizing the CSR and GB of the SSB-OOFDM signal. In addition, the influence of the optical signal-to-noise ratio and the electronic filter on the optimum CSR of the system is further analyzed.
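The CSR/SSBI relationship the abstract describes can be illustrated with a minimal numerical sketch (not the paper's model): under square-law photodetection of an SSB signal $E(t) = C + s(t)$, the photocurrent $|E(t)|^2$ contains the wanted carrier-signal beat $2\,\mathrm{Re}\{C^{*}s(t)\}$ and the unwanted signal-signal beat $|s(t)|^2$ (the SSBI). The sideband is modeled here, as an assumption, by a unit-power complex Gaussian process (a common proxy for many-subcarrier OFDM); the CSR values are arbitrary illustration points.

```python
import numpy as np

# Sketch of square-law detection of an SSB-OFDM signal E(t) = C + s(t).
# The photocurrent |E|^2 yields the wanted beat 2*Re{C* s(t)} and the
# SSBI term |s(t)|^2. Raising the carrier-to-sideband ratio (CSR)
# increases the wanted beat power relative to the SSBI fluctuations.

rng = np.random.default_rng(0)
n = 2 ** 16
# Unit-power complex Gaussian proxy for a many-subcarrier OFDM sideband
s = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

for csr_db in (0, 4, 8):                      # illustrative CSR values
    C = np.sqrt(10 ** (csr_db / 10))          # carrier amplitude for this CSR
    signal_beat = 2 * np.real(np.conj(C) * s) # wanted linear beat term
    ssbi = np.abs(s) ** 2                     # signal-signal beat (SSBI)
    # Compare wanted-beat power to the AC power of the SSBI (its DC
    # component is a removable bias, so we use the variance).
    ratio_db = 10 * np.log10(np.mean(signal_beat ** 2) / np.var(ssbi))
    print(f"CSR = {csr_db} dB -> signal/SSBI ratio ~ {ratio_db:.1f} dB")
```

With a unit-power Gaussian sideband the ratio grows dB-for-dB with the CSR, which is the qualitative behavior the abstract reports: a larger CSR loosens the SSBI-induced degradation, at the cost of optical power spent on the carrier.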
