Abstract

With the rapid growth of the Internet, data center network traffic has exploded, and data center network energy consumption has risen sharply along with it. Existing routing algorithms optimize routing only for Quality of Service (QoS) and Quality of Experience (QoE), ignoring the energy consumption of data center networks. To address this problem, this paper proposes Ee-Routing, an energy-saving routing algorithm based on deep reinforcement learning. First, our method takes the energy consumption and network performance of the data plane in a software-defined network as a joint optimization goal and establishes an energy-efficient traffic scheduling scheme for elephant flows and mice flows. Then, we use Deep Deterministic Policy Gradient (DDPG), a deep reinforcement learning algorithm, to achieve continuous, energy-efficient traffic scheduling for the joint optimization goal. The training process of our method is based on a Convolutional Neural Network (CNN), which effectively improves the convergence efficiency of the algorithm. After training converges, the algorithm outputs energy-efficient path weights for the elephant flows and the mice flows, completing a balanced schedule between routing energy savings and network performance. Finally, the results show that our algorithm has good convergence and stability. Compared with the DQN-EER routing algorithm, Ee-Routing improves the energy saving percentage by 13.93%; compared with the EARS routing algorithm, Ee-Routing reduces delay by 13.73%, increases throughput by 10.91%, and reduces the packet loss rate by 13.51%.
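To make the actor-critic structure described above concrete, the following is a minimal sketch of a DDPG-style actor and critic with CNN encoders that map a network state to continuous path weights. The 12x12 traffic-matrix state, the 8 candidate-path action slots, the layer sizes, and the joint reward form are illustrative assumptions, not values taken from the paper.

```python
# Minimal DDPG-style actor/critic sketch with CNN encoders (PyTorch).
# All dimensions and the reward form below are illustrative assumptions.
import torch
import torch.nn as nn

class Actor(nn.Module):
    def __init__(self, n_paths=8):
        super().__init__()
        self.cnn = nn.Sequential(                 # CNN encoder over the traffic matrix
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.Sequential(nn.Linear(32 * 12 * 12, 128), nn.ReLU(),
                                  nn.Linear(128, n_paths))

    def forward(self, state):
        # Continuous action: normalized weights over candidate paths.
        return torch.softmax(self.head(self.cnn(state)), dim=-1)

class Critic(nn.Module):
    def __init__(self, n_paths=8):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.Flatten())
        self.head = nn.Sequential(nn.Linear(16 * 12 * 12 + n_paths, 128), nn.ReLU(),
                                  nn.Linear(128, 1))

    def forward(self, state, action):
        # Q-value of the (state, path-weight) pair.
        return self.head(torch.cat([self.cnn(state), action], dim=-1))

actor, critic = Actor(), Critic()
state = torch.randn(1, 1, 12, 12)      # e.g. a per-link utilization matrix (assumed shape)
weights = actor(state)                 # continuous path weights for one flow group
q_value = critic(state, weights)       # critic scores the scheduling decision
# During training, a joint reward such as r = -(alpha * energy + beta * delay)
# (coefficients assumed) would couple energy savings with network performance.
```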
