Abstract

This paper analyzes the delay-throughput curves of the optical burst switching (OBS) burstifier. The burstifier incorporates a timer-based scheme with a minimum burst size, i.e., bursts are subject to padding under light load. Due to this padding, the normalized throughput of the burstifier may fall below unity; under high load, by contrast, padding seldom occurs. For the light-load scenario of interest, the throughput-delay curves are derived and the results are assessed against trace-driven simulation. The influence of long-range dependence and instantaneous variability is analyzed, leading to the conclusion that there is a threshold timeout value beyond which the throughput curves flatten out to unity. This result motivates the introduction of adaptive burstification algorithms, which provide a timeout value that minimizes delay yet keeps the throughput very close to unity. The dependence of this optimum timeout value on traffic long-range dependence and instantaneous burstiness is discussed. Finally, three adaptive timeout algorithms are proposed, which trade off complexity against accuracy.
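As a concrete illustration (not taken from the paper), the following Python sketch models the timer-based assembly scheme described above: packets are queued until the assembly timer expires, and the resulting burst is padded up to a minimum size whenever the accumulated payload falls short, which is exactly what drives the normalized throughput below unity at light load. All names and parameters here (TimerBurstifier, timeout, b_min) are hypothetical, and the timer itself is assumed to be driven by the caller.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class TimerBurstifier:
        # Timer-based burst assembler with a minimum burst size (sketch).
        # `timeout` is the assembly timer T; `b_min` is the minimum burst
        # size in bytes. Both names are illustrative, not from the paper.
        timeout: float
        b_min: int
        _sizes: List[int] = field(default_factory=list)

        def on_packet(self, size: int) -> None:
            # Queue an arriving packet's payload size until the timer fires.
            self._sizes.append(size)

        def on_timeout(self) -> Tuple[int, int]:
            # Timer expiry: emit the burst. Under light load the accumulated
            # payload may fall below b_min, in which case padding is added.
            payload = sum(self._sizes)
            padding = max(0, self.b_min - payload)
            self._sizes.clear()
            return payload, padding

    def normalized_throughput(bursts: List[Tuple[int, int]]) -> float:
        # Fraction of transmitted bytes carrying payload; equals unity only
        # when no burst needed padding (the high-load regime).
        payload = sum(p for p, _ in bursts)
        total = sum(p + q for p, q in bursts)
        return payload / total if total else 1.0

For example, with b_min = 9000 bytes, two 1500-byte packets arriving within one timeout yield a burst of 3000 payload bytes plus 6000 padding bytes, so normalized throughput drops to 1/3; a longer timeout would accumulate more payload per burst and push this ratio back toward unity, at the cost of added delay, which is the trade-off the adaptive algorithms in the paper address.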
