Abstract

Mobile edge computing (MEC) enables terminals to offload their tasks to edge servers instead of the central cloud for efficient execution. However, most research on task offloading is limited to binary offloading of atomic tasks with a single edge server, while in practice, serial tasks of computation-intensive applications are more important. Therefore, we jointly study task offloading and resource allocation for serial tasks in the multi-terminal, multi-server scenario. A serial task is divided into multiple sub-tasks that are executed sequentially, leading to better utilization of fragmented resources. The offloading behavior of the inter-coupled terminals is formulated as a noncooperative stochastic game, with evaluation metrics defined jointly by task priority, average task delay, and energy consumption. Aiming to minimize the long-term cost of the whole system, we adopt a multi-agent reinforcement learning (MARL) algorithm that dynamically adjusts offloading strategies, subchannel selection, transmit power, and allocated resources using only partial state information. Simulation results demonstrate the feasibility of the proposed algorithm for solving the formulated problem in a distributed way. Compared with five benchmark algorithms, it achieves a lower system cost and schedules delay-sensitive, high-priority tasks earlier while maintaining a lower task failure rate.
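To make the distributed, partially observed offloading idea concrete, the following is a minimal toy sketch, not the paper's algorithm: each terminal runs an independent learner with a bandit-style Q update, chooses whether to execute locally or offload to one of the edge servers, and observes only coarse server congestion. All costs, sizes, and dynamics here are invented assumptions for illustration.

```python
# Toy sketch (hypothetical numbers): independent per-terminal learners that
# pick an offloading target from partial observations of server load.
import random

N_TERMINALS = 3          # hypothetical number of terminals
N_SERVERS = 2            # hypothetical number of edge servers
ACTIONS = N_SERVERS + 1  # action 0 = execute locally; 1..N = offload to server
EPISODES = 2000
ALPHA, EPS = 0.1, 0.1    # learning rate and exploration probability

def step_cost(action, load):
    """Toy cost (stand-in for weighted delay + energy): local execution is
    slow but load-independent; offloading is cheap but grows with congestion."""
    if action == 0:
        return 3.0
    return 1.0 + load[action - 1]

def observe(load):
    """Partial observation: only a coarse, capped view of each server's load."""
    return tuple(min(int(l), 3) for l in load)

random.seed(0)
q = [dict() for _ in range(N_TERMINALS)]  # one Q-table per terminal (agent)

for _ in range(EPISODES):
    load = [0.0] * N_SERVERS
    for agent in range(N_TERMINALS):
        s = observe(load)
        qs = q[agent].setdefault(s, [0.0] * ACTIONS)
        # epsilon-greedy over estimated costs (lower is better)
        a = random.randrange(ACTIONS) if random.random() < EPS else qs.index(min(qs))
        cost = step_cost(a, load)
        if a > 0:
            load[a - 1] += 1.0  # offloading adds load to the chosen server
        # bandit-style update: move the estimate toward the observed cost
        qs[a] += ALPHA * (cost - qs[a])

# Greedy rollout after learning: agents offload and spread across servers
# instead of all crowding onto one.
greedy, load = [], [0.0] * N_SERVERS
for agent in range(N_TERMINALS):
    qs = q[agent].get(observe(load), [0.0] * ACTIONS)
    a = qs.index(min(qs))
    greedy.append(a)
    if a > 0:
        load[a - 1] += 1.0
print(greedy)
```

The sketch omits everything that makes the paper's setting hard (sequential sub-tasks, subchannel and power selection, task priorities, long-term discounted cost), but it shows the core mechanism the abstract describes: coupled terminals learning complementary offloading strategies in a distributed way from partial state only.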
