Abstract

To reduce cooling energy consumption, data centers are advised to raise server intake temperature setpoints. In tropical climates, however, data center operators are still found operating at lower temperatures. In this paper, we demonstrate that a floating setpoint with a lowered temperature value reduces the overall energy consumption of tropical data centers compared with raising the temperature statically. We achieve this by applying a deep reinforcement learning algorithm to a hybrid data center model built from data collected from a highly efficient data center. This yields an optimal control strategy that minimizes energy costs while operating within the required set of operational constraints. We then evaluate the behavior of the control strategy to account for the exact sources of energy savings. The deep reinforcement learning algorithm learns by continually interacting with the built data center model, without any prior knowledge of the data center, and is trained under both the full-load and the part-load configuration of the data center. Testing results show that further energy savings of up to 3% under full load and 5.5% under part load can be achieved with targeted cooling provisioning, while operating within constraints, in an already cooling-efficient data center. We find that while building-level optimization studies of data centers generally improve energy efficiency, the source of the energy savings is often not well accounted for. Our studies show that the reduction of server fan usage, and not the reduction of cooling energy consumption, is the main contributor to energy savings in a deep reinforcement learning-driven data center operating in the tropics.
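The optimization objective described above — minimizing energy cost while respecting operational constraints such as server intake temperature limits — can be illustrated with a toy sketch. This is not the authors' model or algorithm; the environment, constants, and constraint penalty below are all hypothetical, and a random-sampling evaluation stands in for the deep reinforcement learning agent.

```python
# Hypothetical sketch (not the paper's code): evaluate candidate cooling
# setpoints against a toy data-center model, rewarding low energy use and
# penalizing violations of an assumed intake-temperature constraint.
import random

random.seed(0)

INTAKE_MAX_C = 32.0  # illustrative operational constraint on server intake

def step(setpoint_c):
    """Toy model: returns (intake temperature, total energy draw)."""
    intake = setpoint_c + random.uniform(1.0, 3.0)   # heat pickup in the aisle
    chiller = max(0.0, 40.0 - setpoint_c) * 2.0      # cooling energy (lower setpoint = more)
    fans = max(0.0, intake - 27.0) * 5.0             # server fans ramp up when intake is warm
    return intake, chiller + fans

def reward(intake, energy):
    # Large penalty for breaching the operational constraint
    penalty = 100.0 if intake > INTAKE_MAX_C else 0.0
    return -(energy + penalty)

# Score each candidate setpoint over many simulated steps and pick the best
best = max(range(18, 30),
           key=lambda sp: sum(reward(*step(float(sp))) for _ in range(1000)))
print("best setpoint:", best)
```

The sketch captures the trade-off the abstract identifies: too low a setpoint wastes chiller energy, while too high a setpoint drives up server fan usage, so the optimum floats between the two.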
