Abstract

High temperatures within a data center can cause a number of problems, such as increased cooling costs and higher hardware failure rates. To address this problem, researchers have shown that workload management focused on a data center's thermal properties effectively reduces temperatures within the data center. In this paper, we propose a method to predict a workload's thermal effect on a data center that is suitable for real-time scenarios. We use machine learning techniques, such as artificial neural networks (ANNs), as our prediction methodology, and we conduct our experiments on real data collected from a data center's normal operation. To reduce the data's complexity, we introduce a thermal impact matrix that captures the spatial relationship between the data center's heat sources, such as the compute nodes. Our results show that machine learning techniques can predict a workload's thermal effects in a timely manner, making them well suited for real-time scenarios. Building on these temperature prediction techniques, we develop a thermal-aware workload scheduling algorithm for data centers that aims to reduce both power consumption and temperatures. A simulation study is carried out to evaluate the performance of the algorithm. Simulation results show that our algorithm can significantly reduce temperatures in data centers at the cost of a tolerable decline in performance.
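To make the idea of the thermal impact matrix concrete, the following is a minimal sketch of the kind of pipeline the abstract describes: per-node workload is projected through a matrix of spatial influence weights, and a small neural network maps the result to predicted sensor temperatures. This is not the paper's implementation; all names, shapes, and the synthetic data are illustrative assumptions, and scikit-learn's MLPRegressor stands in for whatever ANN the authors used.

```python
# Hypothetical sketch of thermal prediction via a thermal impact matrix.
# Shapes, features, and data are illustrative, not the paper's method.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_nodes, n_sensors, n_samples = 8, 4, 500

# Thermal impact matrix: entry [i, j] weights how strongly heat from
# compute node i influences temperature sensor j (e.g., derived from
# physical distance). It condenses the room's spatial layout into one matrix.
impact = rng.uniform(0.0, 1.0, size=(n_nodes, n_sensors))

# Synthetic training data: per-node utilization -> sensor temperatures.
util = rng.uniform(0.0, 1.0, size=(n_samples, n_nodes))
ambient = 22.0
temps = ambient + (util @ impact) * 5.0 + rng.normal(0.0, 0.2, (n_samples, n_sensors))

# Feature reduction: project the workload through the impact matrix, so
# the model sees spatially weighted heat contributions (n_sensors features)
# rather than raw per-node utilization (n_nodes features).
features = util @ impact

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(features, temps)

# Predict the thermal effect of a candidate workload placement; a
# thermal-aware scheduler could compare such predictions across placements
# and pick the one with the coolest predicted outcome.
candidate = rng.uniform(0.0, 1.0, size=(1, n_nodes))
print(model.predict(candidate @ impact))
```

Because the trained model is a small feed-forward network, a prediction is a handful of matrix multiplications, which is what makes this style of approach fast enough for the real-time scheduling scenario the abstract targets.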
