Abstract

Non-coherent optical transceivers have attracted considerable attention with the rise of visible light communication in 5G networks. Machine-learning-based transceivers have recently been proposed, motivated by the tradeoff between the spectral and power efficiency of asymmetrically clipped optical OFDM (ACO-OFDM) and DC-biased optical OFDM (DCO-OFDM). In this paper, we propose a regression decision tree (RDT) based optical transceiver that predicts the transmitted signal at the receiver side. The proposed transceiver compensates for the clipping noise produced by clipping the negative parts of the transmitted signal. We evaluate and analyze the RDT-based optical transceiver architecture across several performance aspects and compare the results with benchmark techniques and an alternative deep neural network (DNN) transceiver. The proposed optical OFDM transceiver improves spectral and power efficiency compared to recent work.
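The core idea, an RDT that regresses the original transmitted signal from clipped, noisy received samples, can be sketched in a few lines. This is a minimal toy illustration, not the paper's architecture: it uses i.i.d. Gaussian samples in place of actual OFDM frames, a single-sample feature in place of the paper's transceiver design, and scikit-learn's `DecisionTreeRegressor` as a stand-in for the proposed RDT. All names and parameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy data (illustrative, not from the paper): real-valued baseband samples.
x = rng.standard_normal(6000)                     # "transmitted" samples
clipped = np.clip(x, 0.0, None)                   # ACO-OFDM-style clipping of negative parts
rx = clipped + 0.05 * rng.standard_normal(6000)   # received samples with channel noise

# The RDT learns a mapping from received samples back to the transmitted
# signal, implicitly compensating part of the clipping noise.
tree = DecisionTreeRegressor(max_depth=8, random_state=0)
tree.fit(rx[:5000].reshape(-1, 1), x[:5000])

pred = tree.predict(rx[5000:].reshape(-1, 1))
mse_rdt = np.mean((pred - x[5000:]) ** 2)
mse_raw = np.mean((rx[5000:] - x[5000:]) ** 2)
```

On this toy setup, the tree's mean-squared error is lower than treating the raw received samples as the estimate, because the tree learns the conditional mean of the clipped (negative) region rather than returning zero there.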
