Abstract

I. Norros has done well to remind us of the complementary facets of teletraffic theory and of its sister discipline, performance modelling: Beginning with the early days of telephony, traffic engineering has firmly demonstrated its practical usefulness in helping develop telecommunications systems. It has done so by leveraging key insights derived from simple yet robust traffic models. In the process, the field has come to encompass a vast and varied intellectual landscape where many have been at play with great relish! This tension between the utilitarian and the playful is a healthy one, not to be forgotten, fueling a rich mix of activities, some of which were touched upon in the paper. The sample is admittedly small, but already contains some of the beautiful contributions we have come to expect. Given that such selections are necessarily subjective, I would like to briefly discuss two areas of applied probability which are relevant to telecommunication problems. I expect that further strengthening their ties to traffic theory will keep applied probabilists busy for a long time to come!

1. Limit theorems as a modelling engine

Since its inception, traffic theory has relied on limit theorems as an effective modelling paradigm. The Palm-Khintchin theorem is the earliest and best-known example along these lines: in the many-caller regime, this celebrated result identifies Poisson processes as the robust model for call arrivals to a telephone exchange. The approach has been emulated many times over in a number of different settings, viz. the Poisson process for describing the aggregation of a very large number of bursty data sources, the macroscale modelling of many TCP flows, the emergence of fractional Brownian motion and M/G/∞ processes as models for long-range dependent traffic patterns, and, more recently, the exponentially distributed path-link durations in wireless mobile networks in the many-user regime.
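The Palm-Khintchin phenomenon is easy to witness numerically. The toy simulation below (my own illustrative sketch, not taken from the paper; all function names and the choice of uniform interarrivals are my assumptions) superposes many independent renewal sources whose individual interarrival times are uniform, hence decidedly non-exponential. In the aggregate, the squared coefficient of variation of the interarrival gaps comes out close to 1, the Poisson signature:

```python
import random

def sparse_renewal(rate, horizon, rng):
    """One source: a renewal process with Uniform(0, 2/rate) interarrivals
    (mean 1/rate) -- deliberately NOT Poisson on its own (cv^2 = 1/3)."""
    t, points = rng.uniform(0.0, 2.0 / rate), []
    while t < horizon:
        points.append(t)
        t += rng.uniform(0.0, 2.0 / rate)
    return points

def superpose(n_sources, total_rate, horizon, seed=0):
    """Superpose n_sources sparse renewal streams with the given total rate."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n_sources):
        pts.extend(sparse_renewal(total_rate / n_sources, horizon, rng))
    return sorted(pts)

arrivals = superpose(n_sources=500, total_rate=1.0, horizon=5000.0)
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
cv2 = var / mean ** 2  # for exponential gaps (Poisson arrivals), cv^2 = 1
print(round(mean, 3), round(cv2, 3))  # mean near 1/total_rate, cv^2 near 1
```

Each source alone has gap cv^2 = 1/3; it is only the superposition of many sparse sources that produces the exponential-looking aggregate, which is precisely the content of the theorem.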
This limiting approach is a natural one for modelling telecommunication systems, where problems of resource allocation become more pressing, hence more interesting, at high utilization, e.g. when the number of users is large in relation to available resources or when demand approaches capacity. In such situations, the limit regime provides a good starting point for the provisioning of network resources. The relevant model arises through a limiting procedure, possibly after rescaling an appropriate quantity, as the scaling parameter converges to its natural limit. There are several potential benefits to this modelling process: First, model simplification, with the promise of scalability, typically occurs when applying limit theorems, as irrelevant details are filtered out without relying on ad hoc assumptions. Second, limit theory is central to the modern theory of probability and, as such, has been the focus of a huge literature replete with results and techniques. Given this large body of knowledge, it is reasonable to expect the existence of suitable limit theorems which can be applied, under very weak assumptions, to many situations of interest. This should lead to models which are robust and of wide applicability. This limiting approach is likely to be used repeatedly as new telecommunication services and infrastructures emerge. Developing adequate limiting results under the weakest possible assumptions then constitutes a natural objective for traffic engineering. The challenge is
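A minimal instance of the rescaling procedure described above can be sketched as follows (my own toy example, with an arbitrarily chosen Uniform(0, 2) per-slot workload): the centred cumulative traffic of an i.i.d. source, rescaled by the square root of the scaling parameter n, behaves like Brownian motion by Donsker's theorem, so its variance at time t stabilizes to sigma^2 * t regardless of n once n is large:

```python
import random

def rescaled_cumulative(n, t, rng):
    """Centred, sqrt(n)-rescaled cumulative workload up to time t:
    per-slot demands are i.i.d. Uniform(0, 2) with mean 1, variance 1/3."""
    steps = int(n * t)
    total = sum(rng.uniform(0.0, 2.0) for _ in range(steps))
    return (total - steps * 1.0) / (n ** 0.5)

def empirical_variance(n, t, reps, seed=0):
    """Monte Carlo estimate of Var of the rescaled process at time t."""
    rng = random.Random(seed)
    xs = [rescaled_cumulative(n, t, rng) for _ in range(reps)]
    m = sum(xs) / reps
    return sum((x - m) ** 2 for x in xs) / reps

# Donsker limit: variance should be close to sigma^2 * t = (1/3) * 2 = 2/3,
# with the per-slot distributional details filtered out by the rescaling.
v = empirical_variance(n=400, t=2.0, reps=2000)
print(round(v, 3))
```

The point of the exercise is the one made in the text: after centring and rescaling, only the mean and variance of the per-slot demand survive into the limit model, so the irrelevant details of the input distribution are filtered out without ad hoc assumptions.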
