Summary

The bit error rate (BER) in optical communication systems is degraded by factors such as launch power, dispersion, and modal noise. Finding the optimal launch power that yields an acceptable BER is usually difficult on an installed link. In this paper, therefore, a machine learning-based linear regression technique is used to predict the optimal signal quality for a spatial division multiplexed (SDM) fiber optical transmission system over a fixed distance; the technique predicts the optimal value of the continuous launch power. To demonstrate the concept, a generic SDM long-haul optical link of 20 km has been designed and simulated in Optisystem 14.0. The light sources in our experiments are two spatial optical transmitters with an emission wavelength of around 1550 nm. As the first step of the regression analysis, data preparation, normality and correlation checks were performed. A linear regression model was then developed and validated through a summary report, its coefficients, and diagnostic plots. The accuracy of the model was further improved by employing Cook's distance, which helps in dealing with the influence points that hinder the prediction ability of the proposed model. The results show an R-squared value of 0.9366 and an adjusted R-squared of about 0.9344; we therefore infer that the model explains nearly 94% of the variation in the dependent variable.
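The abstract itself includes no code, but the workflow it describes (normality and correlation checks, an ordinary least-squares fit validated by a summary report, and Cook's distance for influence points) maps onto a standard regression pipeline. The sketch below is a minimal illustration in Python with statsmodels; the file name sdm_sweep.csv and the columns launch_power and q_factor are hypothetical placeholders, since the actual Optisystem dataset and variable names are not given here.

```python
# Minimal sketch of the regression workflow described in the abstract.
# Assumptions: a CSV exported from the Optisystem sweep with hypothetical
# columns "launch_power" (dBm) and "q_factor" (signal quality).
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("sdm_sweep.csv")          # hypothetical file name
x, y = df["launch_power"], df["q_factor"]

# Step 1: data preparation -- normality and correlation checks.
print("Shapiro-Wilk (y):", stats.shapiro(y))
print("Pearson r:", stats.pearsonr(x, y))

# Step 2: fit an ordinary least-squares linear regression model.
X = sm.add_constant(x)
model = sm.OLS(y, X).fit()
print(model.summary())                     # summary report and coefficients

# Step 3: flag influence points via Cook's distance and refit without them.
cooks_d = model.get_influence().cooks_distance[0]
mask = cooks_d < 4 / len(df)               # common rule-of-thumb cutoff
refit = sm.OLS(y[mask], X[mask]).fit()
print("R-squared:", refit.rsquared, "Adjusted:", refit.rsquared_adj)
```

The summary-report, coefficient, and diagnostic terminology in the abstract also matches the lm()/summary()/cooks.distance() idiom in R, so the original analysis may well have been done there; the Python version above is only one plausible realization.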