Abstract

In an observable queue, customers' joining decisions may be influenced by wait-aversion and crowd-attraction. These opposing phenomena, together with the diversity of arriving customers, lead to an arrival process that depends on the number of customers present. For the system manager, having more customers may be beneficial, as the attraction they generate can increase future arrivals; it may also saturate the system, resulting in long waits. Rejection at arrival may then be employed to balance these conflicting objectives. With this in mind, we develop a Markov decision process approach to determine how to optimally reject customers in a queueing system with state-dependent arrivals. When the arrival rate is bounded, we compute the optimal policy via value iteration. When the arrival rate is decreasing and convex, we prove that the optimal policy has a threshold form. When the arrival rate is increasing and potentially unbounded, uniformization may not apply. To handle this case, we restrict the analysis to stationary policies and prove the optimality of threshold policies through a computational approach. In addition, we show how to compute the optimal threshold within a finite number of iterations and prove that the long-run expected cost is decreasing and convex in the number of servers. Finally, we illustrate the applicability of our results through the analysis of a linearly increasing arrival rate, identifying the main drivers of the control decisions.
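
To make the bounded-arrival-rate case concrete, the following is a minimal sketch of a relative value iteration for admission control in a uniformized, state-dependent queue. All ingredients here are illustrative assumptions rather than the paper's exact model: the truncation level N, server count c, service rate mu, holding cost h, rejection cost r, and the capped arrival-rate function are hypothetical choices.

```python
"""Relative value iteration for admission (rejection) control in a queue with
state-dependent arrivals. A minimal sketch under assumed parameters; not the
paper's exact model or cost structure."""

import numpy as np

# ---- assumed model parameters (hypothetical) ----
N = 200        # state-space truncation level
c = 3          # number of servers
mu = 1.0       # service rate per server
h = 1.0        # holding cost per customer per unit time
r = 5.0        # lump cost per rejected customer

def arrival_rate(n):
    """Bounded, state-dependent arrival rate (assumed form):
    crowd attraction, capped so that uniformization applies."""
    return min(2.0 + 0.05 * n, 6.0)

# Uniformization constant, slightly inflated to keep the chain aperiodic.
Lam = 1.05 * (max(arrival_rate(n) for n in range(N + 1)) + c * mu)

def bellman(v):
    """One application of the uniformized Bellman operator."""
    tv = np.empty_like(v)
    for n in range(N + 1):
        p_arr = arrival_rate(n) / Lam          # arrival probability per period
        p_srv = min(n, c) * mu / Lam           # service-completion probability
        admit = v[min(n + 1, N)]               # accept: state moves up
        reject = r + v[n]                      # reject: lump cost, state unchanged
        tv[n] = (h * n / Lam
                 + p_arr * min(admit, reject)
                 + p_srv * v[max(n - 1, 0)]
                 + (1.0 - p_arr - p_srv) * v[n])
    return tv

# ---- relative value iteration for the long-run average cost ----
v = np.zeros(N + 1)
for _ in range(100_000):
    tv = bellman(v)
    diff = tv - v
    if diff.max() - diff.min() < 1e-8:         # span-based stopping rule
        break
    v = tv - tv[0]                             # normalize against reference state 0

gain = Lam * (diff.max() + diff.min()) / 2     # average cost per unit time
reject_better = np.array([r + v[n] < v[min(n + 1, N)] for n in range(N + 1)])
threshold = int(np.argmax(reject_better)) if reject_better.any() else N + 1
print(f"average cost ~ {gain:.4f}; reject arrivals when n >= {threshold}")
```

Under these assumed parameters, the printed threshold is the smallest state at which rejecting an arrival is cheaper than admitting it, which is the kind of threshold structure the abstract refers to.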
