Abstract
Recent years have witnessed the rapid growth of smart devices and mobile applications. Mobile applications, however, are typically computation-intensive and delay-sensitive, while User Devices (UDs) are usually resource-limited. Mobile Edge Computing (MEC) has been proposed as a promising paradigm to ease this tension: a UD's tasks can be executed either locally on the device or remotely on an edge server via computation offloading. Many efficient computation-offloading scheduling approaches have been proposed, but most rely on centralized scheduling, which scales poorly in large-scale MEC systems. To address this issue, this paper proposes a distributed scheduling framework that leverages the idea of 'centralized training and distributed scheduling'. Actor-Critic reinforcement learning is adopted to build the framework, where the Actor and the Critic play the roles of distributed scheduling and centralized training, respectively. Extensive simulations are conducted, and the experimental results verify the effectiveness and efficiency of the proposed framework.
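Below is a minimal sketch, assuming a PyTorch implementation, of how the 'centralized training and distributed scheduling' pattern can be realized with Actor-Critic learning: each UD runs a lightweight Actor that maps only its local observation to an offloading decision, while a single Critic observes the global system state during training. This is an illustrative interpretation, not the paper's implementation; the dimensions, network sizes, and reward shape are all assumptions.

```python
# Illustrative sketch (not the authors' code) of centralized training
# with distributed scheduling via Actor-Critic learning.
import torch
import torch.nn as nn

N_UDS, OBS_DIM, GLOBAL_DIM, N_ACTIONS = 4, 8, 32, 2  # assumed sizes

class Actor(nn.Module):
    """One per UD; used at run time, sees only the local observation."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(OBS_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, N_ACTIONS))
    def forward(self, obs):
        # Distribution over {execute locally, offload to edge server}.
        return torch.distributions.Categorical(logits=self.net(obs))

class Critic(nn.Module):
    """Centralized; sees the global state, used only during training."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(GLOBAL_DIM, 128), nn.ReLU(),
                                 nn.Linear(128, 1))
    def forward(self, global_state):
        return self.net(global_state).squeeze(-1)

actors = [Actor() for _ in range(N_UDS)]
critic = Critic()
opt = torch.optim.Adam(
    [p for a in actors for p in a.parameters()] + list(critic.parameters()),
    lr=1e-3)

def train_step(local_obs, actions, global_state, rewards,
               next_global_state, gamma=0.99):
    """One advantage Actor-Critic update over a batch of transitions.

    local_obs: (B, N_UDS, OBS_DIM), actions: (B, N_UDS),
    global_state / next_global_state: (B, GLOBAL_DIM), rewards: (B,).
    """
    # Critic: TD(0) target from the global state (centralized training).
    values = critic(global_state)
    with torch.no_grad():
        targets = rewards + gamma * critic(next_global_state)
    advantage = (targets - values).detach()
    critic_loss = (targets - values).pow(2).mean()

    # Actors: each UD updates from its own observation only
    # (distributed scheduling), weighted by the shared advantage.
    actor_loss = 0.0
    for i, actor in enumerate(actors):
        dist = actor(local_obs[:, i])
        actor_loss = actor_loss - (dist.log_prob(actions[:, i]) * advantage).mean()

    opt.zero_grad()
    (critic_loss + actor_loss).backward()
    opt.step()

# Synthetic batch, purely to show the expected shapes.
B = 16
train_step(torch.randn(B, N_UDS, OBS_DIM),
           torch.randint(N_ACTIONS, (B, N_UDS)),
           torch.randn(B, GLOBAL_DIM),
           torch.randn(B),
           torch.randn(B, GLOBAL_DIM))
```

At deployment time only the per-UD Actors are needed, each acting on its own local observation, which is what makes the scheduling distributed; the Critic and the global state are required only during training.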