Abstract

In this paper, we study the loss and delay of data bursts in an optical buffer composed of a number of fiber delay lines (FDLs). To guarantee quality-of-service (QoS) differentiation in such a buffer, we analytically investigate an offset-time-based scheduling mechanism. We consider a system with C QoS classes, where the high-priority classes have larger offset times than the low-priority classes. For this system, we calculate the total burst loss probability as well as the burst loss probability within each QoS class. Furthermore, we study the delay of an arbitrary arriving data burst, as well as the delay of an arriving burst of a given QoS class.
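The paper's analytical results are not reproduced here; as an illustration of the mechanism only, the following is a minimal discrete-event sketch of an offset-time-based FDL buffer. It assumes a single output wavelength, Poisson control-packet arrivals, exponentially distributed burst lengths, a uniform class mix, and a simple horizon-style reservation without void filling; the function and parameter names (simulate, n_fdls, granularity, offsets) are hypothetical and not taken from the paper.

```python
import math
import random

def simulate(n_bursts=200_000, load=0.8, n_fdls=4, granularity=1.0,
             offsets=(0.0, 2.0), seed=1):
    """Sketch: single-wavelength FDL buffer with offset-based QoS.

    offsets[k] is the offset time of class k; larger offsets model
    higher-priority classes whose data follows the control packet later.
    """
    random.seed(seed)
    mean_burst = 1.0                      # mean burst length (exponential)
    mean_gap = mean_burst / load          # mean control inter-arrival time
    n_classes = len(offsets)
    arrivals = [0] * n_classes
    losses = [0] * n_classes
    horizon = 0.0                         # time the channel becomes free
    t = 0.0
    for _ in range(n_bursts):
        t += random.expovariate(1.0 / mean_gap)
        k = random.randrange(n_classes)   # uniform class mix (assumption)
        arrivals[k] += 1
        data_arrival = t + offsets[k]     # data trails control by the offset
        if data_arrival >= horizon:
            start = data_arrival          # channel free: no FDL needed
        else:
            wait = horizon - data_arrival
            # FDLs grant delays only in multiples of the granularity
            delay = math.ceil(wait / granularity) * granularity
            if delay > n_fdls * granularity:
                losses[k] += 1            # no FDL long enough: burst lost
                continue
            start = data_arrival + delay
        horizon = start + random.expovariate(1.0 / mean_burst)
    return [l / a for l, a in zip(losses, arrivals)]

print(simulate())  # per-class loss probabilities
```

Under these assumptions the class with the larger offset reserves the channel further ahead and therefore sees a markedly lower loss probability than the small-offset class, which is the qualitative differentiation effect the paper quantifies analytically.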
