Abstract

Two finite-capacity loss queues in series, with Poisson arrivals at the first queue and Erlang service time distributions at both queues, are considered. It is assumed that the service rate at the second queue is held fixed and that a maximum buffer space is globally available. For this model, the problem of optimally choosing the service rate at the first queue and the distribution of the buffer space over the two queues is considered, with the objective of minimizing the steady-state total network loss rate. It is shown that a decrease in the service rate at the first queue can reduce the total network loss rate only if this lower service rate decreases the variability of the process feeding the second queue. Numerical results show that, when optimal values are assigned to the parameters of the model, a significant reduction of the network loss rate can be achieved, especially when the network load is moderate and when the service time distributions have a small variance.
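
As an illustration of the model described above, the following is a minimal simulation sketch, not the authors' analytical method. It assumes single-server loss queues in tandem, Erlang-k service at both nodes, and that a customer completing service at the first queue is lost if the second buffer is full; the function names (`simulate`, `erlang`) and the parameter values in the search at the bottom are illustrative assumptions, and the crude grid search merely mimics the paper's joint choice of the first-queue service rate and the buffer split under a global buffer budget.

```python
import heapq
import random


def erlang(k, rate, rng):
    """Erlang-k service time with mean 1/rate (sum of k exponential phases of rate k*rate)."""
    return sum(rng.expovariate(k * rate) for _ in range(k))


def simulate(lam, mu1, mu2, K1, K2, k=2, horizon=200_000.0, seed=0):
    """Estimate the total loss rate of two tandem finite-capacity loss queues.

    lam    : Poisson arrival rate at queue 1
    mu1    : service rate at queue 1 (a decision variable in the paper)
    mu2    : fixed service rate at queue 2
    K1, K2 : buffer spaces at each queue (customer in service included)
    k      : Erlang shape parameter of both service time distributions
    """
    rng = random.Random(seed)
    t, n1, n2, lost = 0.0, 0, 0, 0
    events = [(rng.expovariate(lam), "arr")]  # (event time, event type)
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arr":
            heapq.heappush(events, (t + rng.expovariate(lam), "arr"))
            if n1 < K1:
                n1 += 1
                if n1 == 1:  # server 1 was idle, start a service
                    heapq.heappush(events, (t + erlang(k, mu1, rng), "dep1"))
            else:
                lost += 1  # arrival blocked at queue 1
        elif kind == "dep1":
            n1 -= 1
            if n1 > 0:
                heapq.heappush(events, (t + erlang(k, mu1, rng), "dep1"))
            if n2 < K2:
                n2 += 1
                if n2 == 1:  # server 2 was idle, start a service
                    heapq.heappush(events, (t + erlang(k, mu2, rng), "dep2"))
            else:
                lost += 1  # departure from queue 1 lost: queue 2 full
        else:  # "dep2"
            n2 -= 1
            if n2 > 0:
                heapq.heappush(events, (t + erlang(k, mu2, rng), "dep2"))
    return lost / t  # total losses per unit time


if __name__ == "__main__":
    # Illustrative parameters: arrival rate, fixed rate at queue 2, global buffer budget.
    lam, mu2, B = 1.0, 1.2, 8
    best = min(
        ((simulate(lam, mu1, mu2, K1, B - K1), mu1, K1)
         for K1 in range(1, B)
         for mu1 in (0.8, 1.0, 1.2, 1.5, 2.0)),
        key=lambda x: x[0],
    )
    print(f"loss rate {best[0]:.4f} at mu1={best[1]}, K1={best[2]}, K2={B - best[2]}")
```

Such a simulation can only approximate the steady-state loss rate; the paper's conclusions rest on exact analysis of the tandem loss model, but a sketch like this makes it easy to observe the qualitative effect of slowing the first server and of shifting buffer space between the two queues.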
