Abstract

The explosive growth of distributed computing resources in mobile edge computing (MEC) creates a need for a capable controller to ensure their efficient utilization. Software-defined networking (SDN) capabilities can be used in an MEC environment to reduce the energy consumption of user devices without service disruption. Although SDN was not proposed specifically for edge computing, it can serve as an enabler that unlocks the real potential of edge computing and lowers the complexity barriers involved. This paper investigates the resource allocation problem in wireless MEC with the aim of saving the battery power of user devices. Learning from experience plays an important role in an SDN-based MEC infrastructure, where reinforcement learning (RL) considers a long-term goal in addition to the immediate reward, which is essential in a dynamic environment. A novel software-defined edge cloudlet (SDEC) based RL optimization framework is proposed in this paper to tackle the energy minimization problem in wireless MEC. Specifically, Q-learning and cooperative Q-learning based RL schemes are proposed for this intractable problem. Simulation results reveal that the proposed scheme achieves superior performance in saving the battery power of a user device compared to benchmark methods such as Q-learning with a random action-selection policy and Q-learning with epsilon-greedy exploration.
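The abstract names tabular Q-learning with epsilon-greedy exploration as a baseline. The following minimal sketch illustrates that generic update rule only; the state space, action space, reward signal, and hyperparameter values are placeholder assumptions, not the paper's formulation (in the paper's setting, the reward would presumably encode battery-power savings of the user device).

```python
import numpy as np

# Assumed placeholders: discretized MEC system states and offloading actions.
n_states, n_actions = 10, 4
alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate
Q = np.zeros((n_states, n_actions))

def choose_action(state: int) -> int:
    """Epsilon-greedy: explore with probability epsilon, otherwise exploit."""
    if np.random.rand() < epsilon:
        return np.random.randint(n_actions)
    return int(np.argmax(Q[state]))

def update(state: int, action: int, reward: float, next_state: int) -> None:
    """Standard Q-learning update toward the bootstrapped target."""
    target = reward + gamma * np.max(Q[next_state])
    Q[state, action] += alpha * (target - Q[state, action])
```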
