Abstract

The stochastic resource-constrained project scheduling problem (SRCPSP) extends the resource-constrained project scheduling problem to the case where activity durations are stochastic. In real situations, where an activity's duration is not known until the activity finishes, open-loop static policies such as activity-based and priority-based policies cannot cope well with duration variability. A dynamic policy based on closed-loop decision making is therefore a natural alternative for minimizing the makespan. This study presents a dynamic policy that selects the activities to start at each decision time point. The performance of static and dynamic policies, both based on variable neighborhood search, is evaluated in a discrete-event simulation environment. Experiments with the J120 instance set from PSPLIB and several probability distributions of activity duration show that the dynamic policy outperforms the static policies; even when variability is high, it provides stable, high-quality solutions.
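To illustrate the distinction between open-loop (static) and closed-loop (dynamic) scheduling that the abstract describes, the sketch below simulates a parallel schedule generation scheme under stochastic durations: at each decision time point, eligible activities are started in priority order while resource capacity allows. This is a minimal illustration under stated assumptions only; the `Activity` class, the exponential duration sampling, the single renewable resource, and the stand-in priority rules are hypothetical and do not reproduce the paper's VNS-based policies.

```python
# Hypothetical sketch of closed-loop scheduling under stochastic durations.
# All names and the duration model are illustrative assumptions, not the
# paper's actual method.
import heapq
import random
from dataclasses import dataclass, field


@dataclass
class Activity:
    idx: int
    mean_duration: float
    demand: int                      # units of the single renewable resource
    preds: set = field(default_factory=set)


def simulate(acts, capacity, priority, seed=0):
    """Parallel schedule generation: at each decision time point, start
    eligible activities in priority order while capacity remains."""
    rng = random.Random(seed)
    finished, running = set(), []    # running: heap of (finish_time, idx)
    pending = {a.idx: a for a in acts}
    free, now = capacity, 0.0
    while pending or running:
        # Activities whose predecessors are done and whose demand fits.
        eligible = [a for a in pending.values()
                    if a.preds <= finished and a.demand <= free]
        for a in sorted(eligible, key=lambda a: priority(a, now, finished)):
            if a.demand <= free:
                dur = rng.expovariate(1.0 / a.mean_duration)  # realized duration
                heapq.heappush(running, (now + dur, a.idx))
                free -= a.demand
                del pending[a.idx]
        # Advance to the next completion, i.e. the next decision time point.
        now, done = heapq.heappop(running)
        finished.add(done)
        free += next(a.demand for a in acts if a.idx == done)
    return now                       # makespan of this realization


# Static policy: the ranking is fixed before execution (open loop).
static = lambda a, now, finished: a.idx
# Dynamic policy: re-ranked at each decision point from the revealed state;
# "largest remaining expected work first" is a stand-in rule here.
dynamic = lambda a, now, finished: -a.mean_duration

acts = [Activity(0, 3.0, 1), Activity(1, 5.0, 2, {0}), Activity(2, 2.0, 1, {0})]
print(simulate(acts, capacity=2, priority=dynamic))
```

Averaging the returned makespan over many simulation replications, as the paper does in its discrete-event experiments, is what allows the static and dynamic policies to be compared under different levels of duration variability.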
