Abstract

Data teletraffic is characterized by bursty arrival processes. Performance models are characterized by a desire to know under what circumstances the probability that an arrival finds a full input buffer is very small. In this paper I examine how four models proposed in the literature perform on two data sets of local area network traffic. Among my conclusions are (1) the protocol governing the data transmission may have a substantial effect on the statistical properties of the packet stream, (2) the probability that a finite buffer of size b overflows may not be adequately approximated by the probability that an infinite buffer holds at least b packets, and (3) a data-based estimate of the large-deviation rate function does the best job of estimating packet loss on these data sets. This method may still overestimate the loss rate by several orders of magnitude, so there is room for further refinement.
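The abstract does not spell out the estimator behind conclusion (3), so the sketch below illustrates one common data-based rate-function approach of the kind referred to, not necessarily the paper's exact method: estimate the scaled cumulant generating function Lambda(theta) of the arrival process from block sums of the trace, take theta* as the largest theta with Lambda(theta) <= theta * c for service rate c, and approximate the overflow probability of a buffer of size b by exp(-theta* * b). The function names, block length, theta grid, and the synthetic on/off trace are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.special import logsumexp

def estimate_scgf(arrivals, block_len, thetas):
    """Empirical scaled cumulant generating function (sCGF) of the
    arrival process, estimated from non-overlapping block sums."""
    arrivals = np.asarray(arrivals, dtype=float)
    n_blocks = len(arrivals) // block_len
    block_sums = (arrivals[:n_blocks * block_len]
                  .reshape(n_blocks, block_len).sum(axis=1))
    # Lambda(theta) ~ (1 / block_len) * log( mean_k exp(theta * S_k) ),
    # computed via log-sum-exp to avoid numerical overflow.
    return np.array([(logsumexp(theta * block_sums) - np.log(n_blocks))
                     / block_len for theta in thetas])

def overflow_estimate(arrivals, service_rate, buffer_size,
                      block_len=100, thetas=None):
    """Approximate P(buffer content exceeds buffer_size) by exp(-theta* * b),
    where theta* is the largest theta with Lambda(theta) <= theta * c."""
    if thetas is None:
        thetas = np.linspace(1e-4, 5.0, 500)   # illustrative grid
    scgf = estimate_scgf(arrivals, block_len, thetas)
    feasible = thetas[scgf <= thetas * service_rate]
    if feasible.size == 0:
        return 1.0   # no positive decay rate found: predict heavy loss
    theta_star = feasible.max()
    return float(np.exp(-theta_star * buffer_size))

if __name__ == "__main__":
    # Synthetic bursty trace: an on/off source, packets counted per slot.
    rng = np.random.default_rng(0)
    on = rng.random(100_000) < 0.3
    arrivals = np.where(on, rng.poisson(5, 100_000), 0)
    print(overflow_estimate(arrivals, service_rate=3.0, buffer_size=50))
```

Estimating the decay rate directly from block sums of the measured trace, rather than from a fitted traffic model, is what makes the estimate "data-based"; the price is that the finite-buffer loss probability is approximated by an infinite-buffer tail probability, which is exactly the approximation that conclusion (2) cautions about.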
