Abstract

Energy consumption in data centers is currently a main focus of many large-scale enterprises and cloud service providers. Dynamic virtual machine (VM) consolidation techniques are widely used to improve the resource utilization of data centers and reduce their energy consumption. These techniques identify under-utilized physical hosts and pack VMs onto fewer machines, so that idle hosts can be switched to sleep mode, while accounting for real-time fluctuations in the service workload. In this paper, we propose a Reinforcement Learning (RL) based Virtual Machine Consolidation (RL-VMC) framework and apply it to cloud data center operation. In RL-VMC, an agent acting as a VM planner interacts with an environment that encapsulates the running state of the data center, and learns an optimal policy for mapping migrated VMs to physical hosts. Specifically, RL-VMC adopts the on-policy State-Action-Reward-State-Action (SARSA) method, which learns from experience both an optimal VM migration strategy and the management of host power modes. Additionally, we use Potential-Based Reward Shaping (PBRS) to speed up the convergence of RL-VMC. Experimental results show that RL-VMC adapts to dynamic workloads while maintaining an improved balance between energy consumption and Service Level Agreement (SLA) compliance.
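
The abstract names the building blocks (on-policy SARSA, PBRS) without pseudocode. Purely as an illustration of how these pieces fit together, the sketch below shows a tabular SARSA update with potential-based reward shaping. The state encoding (active host count), the potential function, the dummy environment, and all hyperparameters are assumptions made for this example and are not taken from the paper.

```python
import random
from collections import defaultdict

# Toy tabular SARSA with potential-based reward shaping (PBRS).
# State: number of active hosts (an illustrative stand-in for the full
# data-center state); actions: candidate hosts for a migrated VM.
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # assumed hyperparameters
N_HOSTS = 4

Q = defaultdict(float)  # Q[(state, action)] -> value estimate

def potential(state):
    # Hypothetical potential: fewer active hosts => higher potential,
    # so the shaping term nudges the agent toward consolidation.
    return -float(state)

def shaped(reward, s, s_next):
    # PBRS adds gamma*Phi(s') - Phi(s) to the reward; this form
    # provably preserves the optimal policy (Ng et al., 1999).
    return reward + GAMMA * potential(s_next) - potential(s)

def act(state, actions):
    # Epsilon-greedy over current estimates (the on-policy behavior).
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def sarsa_update(s, a, r, s_next, a_next):
    # On-policy TD update: bootstrap on the action actually taken next.
    td_target = shaped(r, s, s_next) + GAMMA * Q[(s_next, a_next)]
    Q[(s, a)] += ALPHA * (td_target - Q[(s, a)])

if __name__ == "__main__":
    # Dummy environment: the active-host count drifts randomly, and the
    # reward penalizes energy use (proportional to active hosts).
    actions = list(range(N_HOSTS))
    s, a = N_HOSTS, act(N_HOSTS, actions)
    for _ in range(1000):
        s_next = max(1, s - 1) if random.random() < 0.5 else min(N_HOSTS, s + 1)
        r = -float(s_next)
        a_next = act(s_next, actions)
        sarsa_update(s, a, r, s_next, a_next)
        s, a = s_next, a_next
```

In the real framework the state, action, and reward definitions come from the data center model described in the paper; the sketch only demonstrates the SARSA update rule and how a shaping potential is layered on top of the base reward without altering which policy is optimal.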
