Multiple-input multiple-output (MIMO) receivers that minimize the mean square error (MMSE) are a common choice in coherent optical communication systems based on spatial division multiplexing (SDM), since this kind of receiver naturally integrates both MIMO equalization and matched filtering. However, when the optical channel exhibits significant mode-dependent loss (MDL) and/or mode-dependent gain (MDG), the inter-symbol interference (ISI) and crosstalk that arise, even with an ideal MIMO MMSE linear receiver, have received little analysis. Moreover, because the MDL/MDG model is random, the resulting ISI, crosstalk, and bit error rate (BER) are also random variables, which makes the system performance harder to predict. In this paper, we first evaluate the residual distortion (ISI and crosstalk) at the MIMO receiver output and then study the validity of modeling it as an additional Gaussian noise term independent of the channel noise. Next, we analyze the probability density function (PDF) of the BER, both analytically and through numerical simulations. For the latter, we use a single-carrier 2-PAM (pulse amplitude modulation) system with pulse shaping, implementing the MIMO MMSE receiver as a MIMO fractionally spaced equalizer (FSE). We simulate the system under different combinations of MDL/MDG level and signal-to-noise ratio (SNR), measured at the receiver input. Additionally, we examine fits of the BER PDF to known closed-form distributions, among which the Generalized Extreme Value (GEV) family is selected, and we propose polynomial functions that relate the system parameters to the GEV PDF parameters. Finally, we present contour maps of BER for a given target of system outage probability (OP), as a function of the MDL/MDG and SNR conditions.