Abstract

Many studies have characterized the upstream delay in time-division multiplexing passive optical networks (TDM-PONs). However, most of them focus on deriving equations for the average delay and ignore other useful metrics such as delay percentiles, which are of paramount interest when dimensioning PONs with delay guarantees. This work shows how to learn delay models from data using supervised machine learning (ML) techniques. Essentially, a nonlinear regression ML algorithm is trained with PON simulation data, showing that it can provide accurate equations for such metrics of interest. In particular, we obtain an R² score above 80% under Poisson traffic and above 65% under self-similar traffic, and we provide a general equation for any delay percentile in the upstream channel of a PON employing interleaved polling with adaptive cycle time. We further show its applicability in dimensioning Tactile Internet and 5G transport scenarios.
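The supervised-learning approach described above could be sketched as follows. This is a hypothetical illustration, not the paper's actual pipeline: the features, the toy delay law, and the choice of gradient-boosted trees as the nonlinear regressor are all assumptions made for the sake of a runnable example.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for PON simulation data (NOT from the paper):
# features are offered load, number of ONUs, and the delay percentile
# of interest; the label is the upstream delay for that percentile.
n = 2000
load = rng.uniform(0.1, 0.9, n)        # normalized offered load
onus = rng.integers(8, 64, n)          # number of ONUs on the PON
pct = rng.choice([50, 90, 99], n)      # target delay percentile

# Toy nonlinear delay law plus simulation noise (illustrative only).
delay = (onus / (1 - load)) * (1 + np.log(100 / (100 - pct + 1)))
delay += rng.normal(0, 0.05 * delay)

X = np.column_stack([load, onus, pct])
X_tr, X_te, y_tr, y_te = train_test_split(X, delay, random_state=0)

# Fit a nonlinear regressor on the "simulation" data and score it on
# held-out samples with the R^2 metric used in the abstract.
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
score = r2_score(y_te, model.predict(X_te))
print(f"R^2 on held-out simulated samples: {score:.3f}")
```

In the paper the training data would come from a simulator of interleaved polling with adaptive cycle time rather than the toy formula above, but the workflow (simulate, fit a nonlinear regressor, validate with R²) is the same.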
