Abstract

Following an extreme natural or man-made event, community recovery management should aim to provide optimal restoration policies for a community over a planning horizon. Computing such optimal restoration policies in the presence of uncertainty poses significant challenges for community leaders: stochastic scheduling for several interdependent infrastructure systems is a difficult control problem with very large decision spaces. The Markov decision process (MDP)-based optimization approach proposed in this study incorporates multiple sources of uncertainty to compute the restoration policies. The optimal scheduling computation presented herein employs the rollout algorithm, which provides an effective computational tool for optimization problems involving real-world large-scale networks and communities. The proposed methodology is applied to a realistic community recovery problem in which different decision-making objectives are considered. The approach accommodates the restoration strategies currently employed in recovery management; computational results indicate that the restoration policies identified herein significantly outperform those current strategies. Finally, the applicability of the method to different risk attitudes of policymakers, including risk-neutral and risk-averse attitudes in community recovery management, is examined.
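To make the rollout idea concrete, the sketch below shows a generic one-step rollout lookahead for a stochastic scheduling MDP: each candidate action is evaluated by applying it once and then simulating a base (heuristic) policy to the planning horizon, averaging the accumulated reward over several Monte Carlo runs. All function names, the toy restoration scenario, and the success probability are illustrative assumptions, not details taken from the paper.

```python
import random

def rollout_action(state, actions, simulate, base_policy, horizon, n_sims=50):
    """One-step rollout lookahead for a stochastic scheduling MDP (sketch).

    For each feasible action, estimate its value by Monte Carlo simulation:
    apply the action once, then follow the base heuristic policy until the
    planning horizon, and average the accumulated reward. Return the action
    with the highest estimated value.
    """
    best_a, best_v = None, float("-inf")
    for a in actions(state):
        total = 0.0
        for _ in range(n_sims):
            s, reward = simulate(state, a)          # stochastic transition
            for _ in range(horizon - 1):
                s, r = simulate(s, base_policy(s))  # follow base heuristic
                reward += r
            total += reward
        if total / n_sims > best_v:
            best_a, best_v = a, total / n_sims
    return best_a

# Toy community-restoration example (illustrative only): the state is the
# set of still-damaged components; each step one repair crew attempts one
# component, succeeding with probability 0.8; reward is the number of
# restored components per step.
random.seed(0)
DAMAGED = frozenset({"water", "power", "roads"})

def actions(state):
    return list(state) or [None]  # None = nothing left to repair

def simulate(state, a):
    if a is not None and random.random() < 0.8:
        state = state - {a}
    return state, len(DAMAGED) - len(state)

def base_policy(state):
    return sorted(state)[0] if state else None  # naive fixed-order heuristic

choice = rollout_action(DAMAGED, actions, simulate, base_policy, horizon=5)
```

In this toy setting all components are symmetric, so any choice is near-optimal; the point of rollout is that, for asymmetric interdependent systems, the one-step lookahead over a base heuristic provably performs at least as well as the heuristic alone.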
