Abstract

The demand for network capacity has increased with the introduction of new digital applications and services, which rely heavily on optical communication networks. While fiber networks form the backbone of optical networks, deploying fiber is not feasible in certain scenarios, making it necessary to use other technologies conjointly. A hybrid all-optical fiber/free-space optic (FSO) link is proposed to address this challenge. Because the all-optical system avoids bandwidth-limited electronics, it supports high-capacity communication. However, the all-optical system faces challenges arising from fiber and FSO channel impairments. To monitor the amount and type of distortion in the optical channel, machine learning (ML) techniques are exploited. In this work, Gaussian process regression (GPR) is utilized as an ML technique to predict three main channel impairments that arise in the hybrid all-optical fiber/FSO channel: turbulence, optical signal-to-noise ratio (OSNR), and chromatic dispersion (CD). The model's performance is evaluated using boxplots and the root-mean-square error (RMSE) and R-squared metrics. The results indicate that the model can predict the various impairments with high accuracy, except under strong amplified spontaneous emission (ASE) noise, where the model demonstrated lower accuracy in predicting light-turbulence parameters. The proposed approach provides a self-aware and self-adaptive communication system and can optimize network resources in the future.
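To make the workflow concrete, the sketch below shows GPR used as a regressor and evaluated with the same RMSE and R-squared metrics mentioned above. It is a minimal illustration, not the paper's actual pipeline: the data are synthetic (a hypothetical stand-in for a received-signal feature mapped to an OSNR value), and the hand-rolled RBF-kernel GP stands in for whatever GPR implementation and features the authors used.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two 1-D input sets.
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gpr_predict(x_train, y_train, x_test, noise=1e-2, length_scale=1.0):
    # Standard GP posterior mean: K_s @ (K + sigma^2 I)^-1 @ y.
    K = rbf_kernel(x_train, x_train, length_scale) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train, length_scale)
    alpha = np.linalg.solve(K, y_train)
    return K_s @ alpha

# Synthetic data: a hypothetical signal feature vs. OSNR (dB); NOT from the paper.
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 10.0, 40)
y_train = 15.0 + 2.0 * np.sin(x_train) + rng.normal(0.0, 0.1, x_train.size)
x_test = np.linspace(0.0, 10.0, 100)
y_true = 15.0 + 2.0 * np.sin(x_test)

y_pred = gpr_predict(x_train, y_train, x_test, noise=0.01, length_scale=1.0)

# The two evaluation metrics named in the abstract.
rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
r2 = 1.0 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
print(f"RMSE = {rmse:.3f}, R^2 = {r2:.3f}")
```

On this smooth synthetic target the GP recovers the function closely, so RMSE is small and R-squared is near one; on real channel data, accuracy would depend on the chosen features, kernel hyperparameters, and noise level, as the abstract's ASE-noise caveat suggests.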
