Abstract

A batch service queue is considered in which the size of each batch and its time of service are subject to control. Costs are incurred for serving customers and for holding them in the system. Viewing the system as a Markov decision process (i.e., a dynamic program) with unbounded costs, we show that the policies minimizing the expected continuously discounted cost and the expected cost per unit time over an infinite horizon are of the following form: at a review point when x customers are waiting, serve min{x, Q} customers (Q being the possibly infinite service capacity) if and only if x exceeds a certain optimal level M. Methods of computing M for both the discounted-cost and average-cost settings are presented.
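
As a rough illustration of how the threshold M might be computed in the discounted-cost case, the sketch below runs value iteration on a discrete-time approximation of the model. Every modelling choice in it (Poisson arrivals per review period, a linear holding cost H, a fixed batch service cost K, a one-period discount factor BETA, and the truncations X_MAX and A_MAX) is an assumption introduced for the example, not a specification taken from the paper; it only demonstrates that the converged value function yields a critical level M above which serving min(x, Q) customers is optimal, as stated in the abstract.

```python
import math

# Illustrative assumptions only -- none of these values come from the paper.
LAM   = 2.0    # mean Poisson arrivals per review period
H     = 1.0    # holding cost per waiting customer per period
K     = 10.0   # fixed cost incurred per service batch
BETA  = 0.95   # one-period discount factor
Q     = 5      # service capacity (finite here for the sketch)
X_MAX = 200    # truncation of the queue-length state space
A_MAX = 30     # truncation of the arrival distribution

# Truncated, renormalised Poisson arrival probabilities Pr{A = k}.
raw = [math.exp(-LAM) * LAM ** k / math.factorial(k) for k in range(A_MAX + 1)]
total = sum(raw)
p = [r / total for r in raw]

def expected_value(V, x):
    """E[V(x + A)] with the next state clipped at X_MAX."""
    return sum(pk * V[min(x + k, X_MAX)] for k, pk in enumerate(p))

def action_costs(V, x):
    """Discounted costs of waiting vs. serving min(x, Q) customers now."""
    wait = H * x + BETA * expected_value(V, x)
    left = max(x - Q, 0)                      # queue left after a batch
    serve = K + H * left + BETA * expected_value(V, left)
    return wait, serve

# Value iteration for the discounted-cost Bellman equation.
V = [0.0] * (X_MAX + 1)
for _ in range(5000):
    V_new = [min(action_costs(V, x)) for x in range(X_MAX + 1)]
    if max(abs(a - b) for a, b in zip(V, V_new)) < 1e-8:
        V = V_new
        break
    V = V_new

# The abstract's structural result: serve min(x, Q) iff x >= M.
# Recover the threshold M from the converged value function.
M = next((x for x in range(X_MAX + 1)
          if action_costs(V, x)[1] <= action_costs(V, x)[0]), None)
print(f"Estimated threshold M = {M}: serve min(x, {Q}) whenever x >= {M}")
```

With these assumed cost parameters the serve action only becomes cheaper than waiting once the holding-cost burden of the queue outweighs the fixed service cost, which is exactly the threshold structure the paper establishes.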
