Abstract

This paper focuses on minimizing the energy consumption of servers in data centers. A server's energy consumption is affected by numerous factors, such as the number of connected devices, the workload being processed, and the energy efficiency of its components. High energy consumption is a serious concern for several reasons: among them, the high temperatures it generates can lead to hardware failure and other technical issues, undermining server reliability. Reducing energy consumption in servers is therefore important for improving the cost-effectiveness, sustainability, scalability, and reliability of data center operations. Deep Q-Learning (DQL), a form of reinforcement learning, can be used to address server energy consumption. The basic idea behind DQL is to train an artificial agent, such as a neural network, to make decisions about energy consumption in real time. The agent is trained by repeatedly performing actions in an environment and earning rewards based on the amount of energy each action consumes. Over time, the agent learns which actions are most likely to lead to energy savings, and it can then be deployed to make real-time energy-consumption decisions on a server. Experimental results of the proposed research show an average power saving of 66% in the server's energy consumption using Deep Q-Learning (DQL).
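The training loop the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration and not the paper's implementation: the environment (a server whose state is its normalized workload, with discrete CPU frequency levels as actions and reward equal to negative power draw plus a penalty for unmet demand) and the linear Q-function standing in for the deep network are both assumptions made for the sake of a self-contained example.

```python
import numpy as np

# Hypothetical toy setup (not from the paper): state = normalized workload,
# actions = discrete CPU frequency levels, reward = -(power drawn), with a
# penalty when the chosen frequency cannot serve the workload. A small
# linear Q-function stands in for the deep Q-network.

rng = np.random.default_rng(0)
N_ACTIONS = 3                       # low / medium / high frequency
FREQ = np.array([0.3, 0.6, 1.0])    # workload capacity of each level
POWER = np.array([0.2, 0.5, 1.0])   # relative power draw of each level

def step(workload, action):
    """Return (reward, next_workload) for serving `workload` at `action`."""
    penalty = 2.0 if FREQ[action] < workload else 0.0  # unmet demand
    reward = -(POWER[action] + penalty)
    return reward, rng.uniform(0.0, 1.0)  # next workload arrives at random

def features(workload):
    return np.array([1.0, workload])  # bias term + state

# Linear Q-function: Q(s, a) = w[a] . features(s)
w = np.zeros((N_ACTIONS, 2))
alpha, gamma, eps = 0.05, 0.9, 0.1  # step size, discount, exploration rate

workload = rng.uniform(0.0, 1.0)
for _ in range(20000):
    phi = features(workload)
    q = w @ phi
    # Epsilon-greedy action selection: mostly exploit, sometimes explore.
    a = rng.integers(N_ACTIONS) if rng.random() < eps else int(np.argmax(q))
    r, nxt = step(workload, a)
    # TD(0) target: immediate reward plus discounted best next-state value.
    target = r + gamma * np.max(w @ features(nxt))
    w[a] += alpha * (target - q[a]) * phi
    workload = nxt

# The learned policy should serve light workloads at a cheap frequency
# and heavy workloads at the high frequency.
light = int(np.argmax(w @ features(0.1)))
heavy = int(np.argmax(w @ features(0.95)))
```

In a real system the linear Q-function would be replaced by a neural network with experience replay and a target network, and the state would include richer signals (temperature, utilization, power telemetry), but the reward-driven loop above is the core mechanism the abstract refers to.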
