Abstract

We address the problem of managing a project or a complex operational process by controlling the execution pace of the activities it comprises. We consider a setting in which these activities are clearly defined, are subject to precedence constraints, and progress randomly. We formulate a discrete-time, infinite-horizon Markov decision process in which the manager reviews progress in each period and decides which activities to expedite to balance expediting costs with delay costs. We derive structural properties for this dynamic project expediting problem, which then enable us to devise exact solution methods that we show significantly reduce the computational burden. We illustrate how our method generalizes to a wide range of so-called stochastic shortest-path problems that are characterized by an intuitive property and capture other applications, including medical decision-making and disease-modeling problems. We also address the state identification issue for our problem, a challenging task in and of itself owing to the precedence constraints. We complement our analytical results with numerical experiments, demonstrating that both our solution and state identification methods significantly outperform extant methods on a supply chain example and on various randomly generated instances. This paper was accepted by Chung Piaw Teo, optimization. Funding: R. Mogre acknowledges support from the U.S.-UK Fulbright Commission and the Lloyd’s Tercentenary Research Foundation through the Fulbright-Lloyd’s Scholar Award, which allowed him to spend an extended period of time at the Massachusetts Institute of Technology. Supplemental Material: The data files are available at https://doi.org/10.1287/mnsc.2023.4876.
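
To make the setting concrete, the following is a minimal, illustrative value-iteration sketch for a toy expediting problem of the kind described above: two sequential activities progress randomly, and in each period the manager chooses which active activities to expedite, trading expediting cost against per-period delay cost. The activity structure, progress dynamics, and all parameters (LEVELS, DELAY_COST, EXPEDITE_COST, P_BASE, P_EXP) are assumptions introduced for illustration only; this is not the paper's model, structural results, or solution method.

```python
# Illustrative value-iteration sketch for a toy dynamic expediting MDP.
# All primitives below are hypothetical placeholders, not the paper's model.
import itertools

LEVELS = 3            # progress levels per activity: 0, 1, 2 (2 = complete)
ACTIVITIES = 2        # activity 0 must finish before activity 1 can start
DELAY_COST = 1.0      # assumed cost per period until the project completes
EXPEDITE_COST = 0.6   # assumed cost per expedited activity per period
P_BASE, P_EXP = 0.3, 0.7  # assumed per-period progress probabilities

def feasible(state):
    # Precedence: activity 1 cannot have progressed before activity 0 is done.
    return state[1] == 0 or state[0] == LEVELS - 1

STATES = [s for s in itertools.product(range(LEVELS), repeat=ACTIVITIES) if feasible(s)]
GOAL = (LEVELS - 1,) * ACTIVITIES

def active(state):
    # Activities currently eligible to progress under the precedence constraint.
    if state[0] < LEVELS - 1:
        return [0]
    return [1] if state[1] < LEVELS - 1 else []

def step_distribution(state, expedite_set):
    # Each active activity advances one level with probability p (higher if
    # expedited); returns a dict {next_state: probability}.
    dist = {state: 1.0}
    for a in active(state):
        p = P_EXP if a in expedite_set else P_BASE
        new = {}
        for s, pr in dist.items():
            adv = list(s)
            adv[a] += 1
            new[tuple(adv)] = new.get(tuple(adv), 0.0) + pr * p
            new[s] = new.get(s, 0.0) + pr * (1 - p)
        dist = new
    return dist

def value_iteration(tol=1e-8, max_iter=10_000):
    # Undiscounted stochastic shortest-path value iteration: the goal state is
    # absorbing and cost-free, and every policy eventually reaches it.
    V = {s: 0.0 for s in STATES}
    for _ in range(max_iter):
        V_new = {}
        for s in STATES:
            if s == GOAL:
                V_new[s] = 0.0
                continue
            acts = active(s)
            best = float("inf")
            # Action: which subset of active activities to expedite this period.
            for k in range(len(acts) + 1):
                for combo in itertools.combinations(acts, k):
                    cost = DELAY_COST + EXPEDITE_COST * len(combo)
                    cost += sum(pr * V[ns]
                                for ns, pr in step_distribution(s, set(combo)).items())
                    best = min(best, cost)
            V_new[s] = best
        if max(abs(V_new[s] - V[s]) for s in STATES) < tol:
            return V_new
        V = V_new
    return V

if __name__ == "__main__":
    V = value_iteration()
    print("Expected cost-to-go from the initial state:", round(V[(0, 0)], 3))
```

Note that STATES above is built by enumerating all progress vectors and filtering by the precedence constraint, which is tractable only for tiny instances; identifying the feasible state space efficiently is precisely the state identification task mentioned in the abstract.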
