Abstract

Mobile edge computing (MEC) networks have recently been adopted to accommodate the fast-growing number of mobile devices that perform complicated tasks with limited hardware capability. Edge nodes with communication, computation, and caching capabilities are now being deployed in MEC networks. Because these resources are physically separated, careful coordination and scheduling are essential for efficient resource utilization and optimal network performance. In this paper, we study mobility load balancing for communication-, computation-, and caching-enabled heterogeneous MEC networks. Specifically, we propose to tackle this problem via a multi-agent deep reinforcement learning-based framework. Users served by overloaded edge nodes are handed over to less loaded ones so as to minimize the load of the most loaded base station in the network. In this framework, the handover decision for each user is made based on the user’s own observation, which comprises the user’s task at hand and the load status of the MEC network. Simulation results show that our proposed multi-agent deep reinforcement learning-based approach can reduce the time-average maximum load by up to 30% and the end-to-end delay by 50% compared to baseline algorithms.
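To make the decision structure described above concrete, the following is a minimal Python sketch of a single handover step under the abstract's setup: each user-agent observes its own task plus the per-node load vector, selects a target edge node, and receives a reward that penalizes the most loaded node (matching the min-max load objective). All names and values here (NUM_NODES, NODE_CAPACITY, the reward shaping) are illustrative assumptions, and the epsilon-greedy least-load policy merely stands in for the trained per-agent deep reinforcement learning policies the paper describes.

```python
import random

NUM_NODES = 4          # hypothetical number of edge nodes
NODE_CAPACITY = 10.0   # assumed identical capacity per node

def observe(task_load, node_loads):
    """Per-user observation as described in the abstract:
    the user's own task plus the load status of the MEC network."""
    return (task_load, tuple(node_loads))

def step(node_loads, user_node, target_node, task_load):
    """Hand the user's task over from its current node to the target
    node, then return the new loads and a reward that penalizes the
    most loaded node (the min-max load objective)."""
    loads = list(node_loads)
    loads[user_node] -= task_load
    loads[target_node] += task_load
    reward = -max(loads) / NODE_CAPACITY   # assumed reward shaping
    return loads, reward

def policy(obs, epsilon=0.1):
    """Placeholder epsilon-greedy policy that targets the least-loaded
    node; in the paper's framework this choice would instead come from
    each agent's trained deep RL policy."""
    task_load, node_loads = obs
    if random.random() < epsilon:
        return random.randrange(NUM_NODES)
    return min(range(NUM_NODES), key=lambda n: node_loads[n])

if __name__ == "__main__":
    node_loads = [8.0, 3.0, 2.0, 1.0]   # node 0 is overloaded
    user_node, task_load = 0, 4.0       # user on node 0 with a 4.0 task
    obs = observe(task_load, node_loads)
    target = policy(obs)
    node_loads, reward = step(node_loads, user_node, target, task_load)
    print(f"handover to node {target}, loads={node_loads}, reward={reward:.2f}")
```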
