The purpose of this paper is to study how resequencing packets in a communication network may affect the performance of the application that consumes the delivered data. We analyze a simple queueing model in which customers may be randomly disordered before arriving at a single-server queue. The performance index chosen is the variance of the waiting time at the server, which measures the "jitter" suffered by the application. The analysis reveals that not resequencing customers does improve service regularity for a wide range of loss probabilities and service times. It also shows that, contrary to the conclusions of previous analyses, resequencing does not necessarily make the performance of the system worse.
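To make the setting concrete, the sketch below simulates a version of the scenario the abstract describes: packets are disordered by the network before reaching a single-server FIFO queue, and the variance of the waiting time is compared with and without a resequencing buffer. All modeling choices here are assumptions, not the paper's model: Poisson packet generation, i.i.d. exponential network delays as the source of disordering, exponential service times, and no packet losses (the loss mechanism mentioned in the abstract is omitted).

```python
# Minimal simulation sketch (assumed model, not the paper's exact one):
# compare waiting-time variance when out-of-order packets are served as they
# arrive versus when they are first held in a resequencing buffer.
import random
import statistics


def simulate(n=50_000, rate=1.0, net_delay=2.0, service=0.8,
             resequence=False, seed=1):
    rng = random.Random(seed)

    # Packet k is emitted at time gen[k] (Poisson process, assumed).
    gen, t = [], 0.0
    for _ in range(n):
        t += rng.expovariate(rate)
        gen.append(t)

    # Random network delay disorders the packets before the queue (assumed).
    arr = [g + rng.expovariate(1.0 / net_delay) for g in gen]

    if resequence:
        # Packet k may enter the queue only once packets 0..k have all arrived;
        # its release time is the running maximum of the arrival times.
        release, running_max = [], 0.0
        for a in arr:
            running_max = max(running_max, a)
            release.append(running_max)
        order = range(n)                      # served in sequence order
    else:
        release = arr                          # served in order of arrival
        order = sorted(range(n), key=lambda k: arr[k])

    # Single-server FIFO queue; wait is measured from arrival at the node
    # to the start of service (so resequencing-buffer time is included).
    waits, free_at = [], 0.0
    for k in order:
        start = max(release[k], free_at)
        free_at = start + rng.expovariate(1.0 / service)
        waits.append(start - arr[k])

    return statistics.mean(waits), statistics.variance(waits)


if __name__ == "__main__":
    for flag in (False, True):
        m, v = simulate(resequence=flag)
        print(f"resequencing={flag}: mean wait={m:.2f}, variance={v:.2f}")
```

Running the two cases side by side illustrates the kind of comparison the paper carries out analytically; the relative ranking of the two variances depends on the assumed load, network-delay spread, and service-time distribution.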