Abstract

Toward the next-generation ultra-long-haul optical network, an extreme gradient boosting (XGBoost)-aided machine learning (ML) model is proposed to maximize the flexibility and uniformity of distributed Raman amplifier (DRA) performance. To achieve accurate prediction of the desired signal gain spectrum and bit error rate (BER), a novel decision-tree-based system is employed to handle the inconsistent dimensionality between pump frequency and pump power. The impact of various model evaluation techniques, namely the mean squared error (MSE), the coefficient of determination (R²), the ratio of root mean square error to the standard deviation of measured data (RSR), and the Nash-Sutcliffe efficiency coefficient (NSE), is discussed in detail. It is shown that the proposed method can diagnose a fault within 2.3 ms with an accuracy of 99.6% and also achieves the highest estimation accuracy and efficiency in comparison with other ML-based tree models. The reported work demonstrates the successful implementation of the XGBoost model for estimating the desired gain profile and BER of a DRA in the low-loss optical wavelength region (1260-1650 nm).
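To make the evaluation metrics named above concrete, the following minimal sketch trains an XGBoost regressor on hypothetical pump-configuration features and scores its gain predictions with MSE, R², RSR, and NSE. The feature ranges, synthetic target, and hyperparameters are illustrative assumptions, not the paper's dataset or tuned model.

```python
# Hedged sketch: XGBoost regression on synthetic DRA pump data, scored with
# the four metrics from the abstract. All data here is fabricated for illustration.
import numpy as np
from xgboost import XGBRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical inputs: 4 pump frequencies (THz) and 4 pump powers (mW).
X = rng.uniform([200] * 4 + [50] * 4, [230] * 4 + [400] * 4, size=(1000, 8))
# Hypothetical target: on-off gain (dB) at one signal wavelength.
y = 0.02 * X[:, 4:].sum(axis=1) + rng.normal(0.0, 0.1, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.05)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

mse = mean_squared_error(y_te, pred)
r2 = r2_score(y_te, pred)
rsr = np.sqrt(mse) / np.std(y_te)   # RMSE over std. dev. of measured data
nse = 1.0 - np.sum((y_te - pred) ** 2) / np.sum((y_te - np.mean(y_te)) ** 2)
print(f"MSE={mse:.4f}  R2={r2:.4f}  RSR={rsr:.4f}  NSE={nse:.4f}")
```

In this single-output setting the NSE formula coincides numerically with R²; the two metrics are listed separately here only to mirror the abstract.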
