To reduce the energy cost of data centers, recent studies suggest distributing the computation workload among multiple geographically dispersed data centers so as to exploit electricity price differences. However, the impact of data center load redistribution on the power grid is not yet well understood. This paper takes a first step towards tackling this important issue by studying how the power grid can proactively take advantage of the data centers' load distribution for the purpose of power load balancing. We model the interactions between the power grid and the data centers as a two-stage problem, in which the utility company chooses proper pricing mechanisms to balance the electric power load in the first stage, and the data centers minimize their total energy cost by responding to the prices in the second stage. We show that the two-stage problem is a bilevel quadratic program, which is NP-hard and cannot be solved using standard convex optimization techniques. We introduce benchmark problems to derive upper and lower bounds on the solution of the two-stage problem. We further propose a branch-and-bound algorithm to attain the globally optimal solution, as well as a low-complexity heuristic algorithm that obtains a close-to-optimal solution. We also study the impact of background load prediction errors using the framework of robust optimization. Simulation results demonstrate that the proposed scheme can not only improve power grid reliability but also reduce the energy cost of the data centers.
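To illustrate the structure of such a two-stage problem (this is a generic sketch, not the paper's exact model), a bilevel quadratic program of this kind can be written with illustrative symbols $p$ (electricity prices chosen by the utility), $x$ (data center workload allocation), $b_i$ (background load at location $i$), and $\bar{\ell}$ (a target load level):
\[
\min_{p \in \mathcal{P}} \;\; \sum_{i} \bigl(b_i + x_i^*(p) - \bar{\ell}\bigr)^2
\quad \text{s.t.} \quad
x^*(p) \in \arg\min_{x \in \mathcal{X}} \; \sum_{i} p_i \, x_i,
\]
where the upper level captures the utility's load-balancing objective and the lower level captures the data centers' cost-minimizing response to the announced prices. The dependence of the upper-level objective on the lower-level minimizer $x^*(p)$ is what makes such bilevel programs nonconvex and, in general, NP-hard, motivating global methods such as branch-and-bound alongside low-complexity heuristics.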