Abstract

This article studies a dynamic advance scheduling problem in which patients of multiple types arrive randomly to book future service at a diagnostic facility in a public healthcare setting. Demand for the diagnostic facility generally arises from multiple sources, such as emergency patients, inpatients, and outpatients. It is challenging for public hospital managers to dynamically allocate their limited capacity to the incoming multi-type patients so as to achieve the patients' heterogeneous waiting-time targets in a cost-effective manner while also maintaining equity among patient types. To address this problem, a finite-horizon Markov Decision Process (MDP) model is proposed that minimizes total expected costs subject to equity constraints. Because of the complex structure of the feasible region and the high-dimensional state space, characterizing structural properties of the optimal scheduling policy and solving the MDP exactly are both intractable. To solve the MDP with high-dimensional state and action spaces, we reformulate it as a multi-stage stochastic programming model and propose a modified Benders decomposition algorithm, based on new dual integer cuts, to solve the reformulation. Based on real data from our collaborating hospital, we perform extensive numerical experiments demonstrating that the proposed approach yields good performance.
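The abstract does not spell out the modified Benders decomposition, so the sketch below only illustrates the general Benders (L-shaped) idea on a deliberately simple stand-in: a one-dimensional capacity decision with convex recourse. All data (costs, penalty, demand scenarios) are hypothetical, and a grid search stands in for the master problem's LP/MIP solver; none of this reflects the paper's actual model or its dual integer cuts.

```python
# Toy L-shaped (Benders) loop: choose capacity x to minimise
# c*x + E[recourse], where recourse penalises unmet demand.
# All numbers are hypothetical; a grid search replaces the master LP.

c, p = 1.0, 4.0                      # unit capacity cost, shortage penalty
scenarios = [2.0, 5.0, 7.0, 10.0]    # equally likely hypothetical demands
prob = 1.0 / len(scenarios)

def subproblem(x):
    """Expected recourse cost Q(x) and one subgradient of Q at x."""
    cost = sum(prob * p * max(d - x, 0.0) for d in scenarios)
    slope = sum(-prob * p for d in scenarios if d > x)
    return cost, slope

cuts = []                            # optimality cuts: (x_k, Q(x_k), slope_k)

def solve_master():
    """Minimise c*x + theta s.t. theta >= each cut, via grid search on [0, 10]."""
    best_val, best_x = float("inf"), 0.0
    for i in range(1001):
        x = i * 0.01
        theta = max((q + g * (x - xk) for xk, q, g in cuts), default=0.0)
        val = c * x + theta
        if val < best_val:
            best_val, best_x = val, x
    return best_x

x = 0.0
for _ in range(50):                  # Benders iterations
    q, g = subproblem(x)
    cuts.append((x, q, g))           # add a cut at the current iterate
    x_next = solve_master()
    if abs(x_next - x) < 1e-9:       # master bound meets subproblem value
        break
    x = x_next

print(f"capacity to book: {x:.2f}, total cost: {c * x + subproblem(x)[0]:.2f}")
```

This toy has convex continuous recourse, so classical subgradient cuts converge; the paper's setting involves integer scheduling decisions, for which standard duality-based cuts can fail and specialized cuts (such as the dual integer cuts mentioned in the abstract) are needed.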
