Abstract

The widespread adoption of Industrial Internet of Things (IIoT) applications has driven the development of cloud-related computing paradigms that seamlessly leverage cloud resources. Resource heterogeneity, device mobility, and dynamic workload behavior make it challenging for the corresponding virtual machine (VM) scheduling problem to process application requests effectively in such cloud environments. Based on reinforcement learning theory, this article proposes an online VM scheduling scheme (OSEC) for joint energy consumption and cost optimization that divides the scheduling process into two phases: VM allocation and VM migration. First, the VMs and physical machines (PMs) in the cloud environment are modeled as sets of states and actions, respectively, and Q-learning feedback drives the iterative computation of Q-values to obtain an optimal parallel allocation sequence for multiple VMs. Then, VMs are migrated among the active PMs according to a grouping policy and the best-fit principle to dynamically consolidate data center resources. Finally, experimental results show that, compared with state-of-the-art algorithms under various conditions, the proposed method reduces energy consumption by approximately 18.25%, VM execution costs by approximately 21.34%, and service level agreement (SLA) violations by approximately 90.51%.
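
To make the two-phase idea concrete, below is a minimal, runnable Python sketch of the scheme as the abstract describes it: Q-learning over VM-index states and PM-index actions for allocation, followed by best-fit migration off under-utilized PMs. Everything concrete here is an assumption for illustration, not the paper's actual formulation: the PM/VM models, the reward shaping and its energy/cost weights, the learning parameters (alpha, gamma, epsilon), the 30% low-utilization threshold, and all function names are hypothetical.

```python
import random
from dataclasses import dataclass


@dataclass
class PM:
    capacity: float      # total CPU capacity of the physical machine (hypothetical model)
    used: float = 0.0    # CPU currently allocated to VMs

    def utilization(self) -> float:
        return self.used / self.capacity


@dataclass
class VM:
    demand: float        # CPU demand of the virtual machine


def reward(pm: PM, vm: VM, w_energy: float = 0.7, w_cost: float = 0.3) -> float:
    """Hypothetical joint energy/cost reward: reward placements that raise the
    utilization of already-active PMs, penalize waking an idle PM."""
    if pm.used + vm.demand > pm.capacity:
        return -1.0                               # infeasible placement
    new_util = (pm.used + vm.demand) / pm.capacity
    wake_penalty = 1.0 if pm.used == 0.0 else 0.0
    return w_energy * new_util - w_cost * wake_penalty


def q_learning_allocate(vms, pms, episodes=500, alpha=0.1, gamma=0.9, eps=0.2):
    """Phase 1: treat each VM index as a state and each PM index as an action,
    iterate Q-values over many episodes, then roll out the greedy policy."""
    q = [[0.0] * len(pms) for _ in vms]
    for _ in range(episodes):
        trial = [PM(p.capacity) for p in pms]     # fresh PM copies per episode
        for i, vm in enumerate(vms):
            if random.random() < eps:             # epsilon-greedy exploration
                a = random.randrange(len(pms))
            else:
                a = max(range(len(pms)), key=lambda p: q[i][p])
            feasible = trial[a].used + vm.demand <= trial[a].capacity
            r = reward(trial[a], vm)
            if feasible:
                trial[a].used += vm.demand
            next_best = max(q[i + 1]) if i + 1 < len(vms) else 0.0
            q[i][a] += alpha * (r + gamma * next_best - q[i][a])
    placement = []                                # greedy rollout of the learned policy
    for i, vm in enumerate(vms):
        for j in sorted(range(len(pms)), key=lambda p: -q[i][p]):
            if pms[j].used + vm.demand <= pms[j].capacity:
                pms[j].used += vm.demand
                placement.append(j)
                break
        else:
            placement.append(-1)                  # no feasible PM; left unplaced in this sketch
    return placement


def best_fit_migrate(vms, pms, placement, low_util=0.3):
    """Phase 2: drain under-utilized PMs by moving each of their VMs to the
    active PM that leaves the least residual capacity (best fit), so the
    drained PMs can be switched off."""
    donors = [j for j, p in enumerate(pms) if 0.0 < p.utilization() < low_util]
    for j in donors:
        for i, host in enumerate(placement):
            if host != j:
                continue
            targets = [k for k, p in enumerate(pms)
                       if k != j and p.used > 0.0
                       and p.used + vms[i].demand <= p.capacity]
            if targets:
                k = min(targets,
                        key=lambda t: pms[t].capacity - pms[t].used - vms[i].demand)
                pms[j].used -= vms[i].demand
                pms[k].used += vms[i].demand
                placement[i] = k
    return placement


if __name__ == "__main__":
    random.seed(42)
    vms = [VM(random.uniform(0.5, 2.0)) for _ in range(20)]
    pms = [PM(8.0) for _ in range(6)]
    placement = q_learning_allocate(vms, pms)
    placement = best_fit_migrate(vms, pms, placement)
    print("active PMs:", sum(1 for p in pms if p.used > 0))
```

This sketch fixes the VM arrival order and ignores migration overheads and resource heterogeneity for brevity; in the full scheme the state and action spaces, the grouping policy, and the reward would encode those factors.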
