Abstract

This paper studies a control problem for optimally switching on and off a cloud computing service modeled as an M/M/1 queue with holding, running, and switching costs. The main result is that an average-optimal policy either always runs the system or is an (M, N)-policy defined by two thresholds M and N: the system is switched on at an arrival epoch when the queue length reaches N and is switched off at a departure epoch when the queue length drops to M. We compare the optimal (M, N)-policy with the classical (0, N)-policy and show that the latter need not be optimal.
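To make the (M, N)-policy concrete, here is a minimal Monte Carlo sketch of an M/M/1 queue operated under such a policy; it estimates the long-run average cost under the stated switching rule. The function and parameter names are ours for illustration, not from the paper, and this is a simulation sketch, not the paper's analytical method.

```python
import random

def simulate_mn_policy(lam, mu, M, N, h, r, s_on, s_off,
                       horizon=1_000_000.0, seed=0):
    """Estimate the long-run average cost of an (M, N)-policy
    for an M/M/1 queue (assumed names and cost parameters):

    lam, mu      : arrival and service rates (lam < mu for stability)
    M, N         : switch-off / switch-on thresholds (0 <= M < N)
    h            : holding cost per customer per unit time
    r            : running cost per unit time while the server is on
    s_on, s_off  : lump-sum costs for switching the server on / off
    """
    rng = random.Random(seed)
    t, q, on, cost = 0.0, 0, False, 0.0
    while t < horizon:
        # Next event: arrival always possible; departure only if serving.
        rate = lam + (mu if (on and q > 0) else 0.0)
        dt = rng.expovariate(rate)
        cost += (h * q + (r if on else 0.0)) * dt  # accrue holding + running cost
        t += dt
        if rng.random() < lam / rate:
            q += 1                                 # arrival epoch
            if not on and q == N:                  # switch on when size reaches N
                on = True
                cost += s_on
        else:
            q -= 1                                 # departure epoch
            if on and q == M:                      # switch off when size drops to M
                on = False
                cost += s_off
    return cost / t

# Example usage with arbitrary illustrative parameters:
avg_cost = simulate_mn_policy(lam=0.7, mu=1.0, M=1, N=5,
                              h=1.0, r=2.0, s_on=10.0, s_off=5.0)
print(f"estimated average cost: {avg_cost:.4f}")
```

Sweeping M and N in such a simulation (and comparing against the M = 0 case) is one way to see numerically why the classical (0, N)-policy can be outperformed when switching costs make it cheaper to turn the server off before the queue empties.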
