Abstract

Data center thermal management requires a thorough understanding of the critical cooling airflow paths. While CFD modeling excels at resolving airflow and temperature fields, it is often computationally intensive. Recent advances in deep learning offer rapid prediction of key parameters at selected points of interest based on data center operating conditions. However, these models fail to deliver the comprehensive flow physics of the entire domain that CFD provides. This paper presents a novel super-resolution approach, originating in computer vision, to model the airflow and temperature fields in the cold aisle of a realistic data center. The proposed model reconstructs a high-fidelity CFD-generated flow field, which typically takes several hours to compute, from a low-fidelity CFD-generated flow field that takes only several minutes. This drastic reduction in computational time makes real-time prediction feasible while still providing the detailed information data center engineers need to understand the flow field. Two variations of the framework are proposed and compared against ground truth generated by the high-fidelity CFD model. A Mean Absolute Error of less than 0.5 °C is achieved for air temperature prediction across all test scenarios, and less than 0.1 m/s for air velocity prediction. A sensitivity study is also conducted to determine the importance of the input features.
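The core idea above can be illustrated with a minimal sketch: a coarse (low-fidelity) temperature field is upsampled to the fine grid, and the result is scored against the high-fidelity "ground truth" using the Mean Absolute Error metric quoted in the abstract. The upsampling function here is a naive nearest-neighbour stand-in for the learned super-resolution model, and all field values and grid sizes are invented for illustration; the paper's actual architecture and data are not reproduced.

```python
import numpy as np

def upsample_nearest(field: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upsampling of a 2-D field.

    A stand-in for the learned super-resolution model, which would map the
    low-fidelity CFD field to the high-fidelity grid.
    """
    return np.repeat(np.repeat(field, factor, axis=0), factor, axis=1)

def mean_absolute_error(pred: np.ndarray, truth: np.ndarray) -> float:
    """MAE in the same units as the field (e.g. degrees C for temperature)."""
    return float(np.mean(np.abs(pred - truth)))

# Toy example: an 8x8 coarse temperature field upsampled to a 32x32 grid.
rng = np.random.default_rng(0)
coarse = 20.0 + 5.0 * rng.random((8, 8))                  # coarse CFD result, degC

# Hypothetical "high-fidelity" field: the upsampled coarse field plus
# small-scale structure the coarse simulation cannot resolve.
truth = upsample_nearest(coarse, 4) + rng.normal(0.0, 0.1, (32, 32))

pred = upsample_nearest(coarse, 4)                        # naive baseline prediction
mae = mean_absolute_error(pred, truth)
print(f"MAE: {mae:.3f} degC")
```

In the paper's framework, the nearest-neighbour step would be replaced by the trained model, which is what drives the reported MAE below 0.5 °C for temperature and 0.1 m/s for velocity.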
