Allocating redundancy to critical smart grid infrastructure is a central issue in disaster recovery planning. This study presents a framework that combines statistical prediction methods with optimization models to solve the optimal redundancy allocation problem. First, statistical simulation methods are developed to identify critical nodes in very large-scale smart grid infrastructure based on the topological features of the embedding networks; a linear integer programming model, based on the generalized assignment problem (GAP), is then presented for allocating redundancy to these critical nodes. The paper contributes to the field by transforming a general redundancy allocation problem (GRAP) model from a high-order nonlinear formulation into a linear one, implemented specifically in the context of smart grid infrastructure. The proposed linear integer programming model exploits the logarithmic multiplication property (the logarithm of a product equals the sum of the logarithms) to reformulate the inherently nonlinear resource allocation problem (RAP) as a linearly separable function, a reformulation that markedly simplifies the problem and makes it amenable to efficient and effective solution. The findings demonstrate that the combined approach of statistical simulation and optimization overcomes the size limitations inherent in a purely optimization-based approach. Notably, the optimal redundancy allocations for large grid systems show that the cost of redundancy is only a fraction of the economic losses incurred due to weather-related outages.
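As a minimal sketch of this logarithmic reformulation, assume a series system in which critical node $i$ receives $n_i$ redundant units, each of reliability $r_i$ (this notation is illustrative and not taken from the paper). Maximizing the product-form system reliability is then equivalent to maximizing a sum of logarithms:
$$
\max \prod_i \bigl[1 - (1 - r_i)^{n_i}\bigr]
\;\Longleftrightarrow\;
\max \sum_i \ln\!\bigl[1 - (1 - r_i)^{n_i}\bigr].
$$
If binary variables $x_{ik}$ select redundancy level $k$ for node $i$ (with $\sum_k x_{ik} = 1$), each term becomes $\sum_k c_{ik}\, x_{ik}$ with precomputed constants $c_{ik} = \ln\!\bigl[1 - (1 - r_i)^{k}\bigr]$, yielding a linearly separable objective compatible with a GAP-style integer program.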