Abstract

This paper proposes and evaluates the use of machine learning (ML) techniques to mitigate the effect of random inter-core crosstalk (ICXT) in 256 Gb/s short-reach systems employing weakly coupled multicore fiber (MCF) and Kramers–Kronig (KK) receivers. The performance improvement provided by the k-means clustering, k-nearest neighbor (KNN), and feedforward neural network (FNN) techniques is assessed and compared with the system performance obtained without ML. The FNN significantly improves system performance by mitigating the impact of ICXT on the received signal, using only 10 neurons in the hidden layer and four input features for the training phase. In contrast, the k-means and KNN techniques provide no performance improvement over the system without ML. These conclusions hold for direct-detection MCF-based short-reach systems in which the product of the skew (relative time delay between cores) and the symbol rate is much lower than one (skew × symbol rate ≪ 1). With the proposed FNN, the bit error rate (BER) remained below 10^−1.8 in all the time fractions under analysis (compared with 100 out of 626 occurrences above the BER threshold when ML was not used). For a BER threshold of 10^−1.8, the system operating with the proposed FNN shows a received optical power improvement of almost 3 dB relative to the standard system operating without ML.
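The abstract specifies the FNN architecture only by its dimensions: four input features and a single hidden layer of 10 neurons. The sketch below is an illustrative forward pass of such a network, not the paper's implementation; the activation functions, weight initialization, and feature definitions are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fnn_forward(x, w1, b1, w2, b2):
    """Forward pass: 4 input features -> 10 hidden neurons (tanh)
    -> 1 output (sigmoid), giving a bit-decision probability.
    Activations are illustrative choices, not from the paper."""
    h = np.tanh(x @ w1 + b1)                       # hidden layer, 10 neurons
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))    # output probability

# Hypothetical weights with the shapes the abstract implies:
# 4 input features, 10 hidden neurons, 1 output.
w1 = rng.standard_normal((4, 10)) * 0.1
b1 = np.zeros(10)
w2 = rng.standard_normal((10, 1)) * 0.1
b2 = np.zeros(1)

x = rng.standard_normal((5, 4))   # 5 received symbols, 4 features each
y = fnn_forward(x, w1, b1, w2, b2)
print(y.shape)  # (5, 1): one decision probability per symbol
```

In practice the weights would be trained (e.g. by backpropagation) on labeled received symbols affected by ICXT; only the layer sizes here are taken from the abstract.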
