Abstract

Multi-access edge computing (MEC) is a computing paradigm that leverages nearby MEC servers to augment the computational capabilities of users with limited computational resources. In this paper, we investigate the computational offloading problem in multi-user, multi-server MEC systems with energy harvesting, aiming to minimize both system latency and energy consumption by jointly optimizing the task offloading location selection and the task offloading ratio. We propose a hierarchical computational offloading strategy based on multi-agent reinforcement learning (MARL). The proposed strategy decomposes the computational offloading problem into two sub-problems: a high-level task offloading location selection problem and a low-level task offloading ratio problem. This decoupling reduces the complexity of the overall problem. To address the two sub-problems, we propose a computational offloading framework based on multi-agent proximal policy optimization (MAPPO), in which each agent generates actions from its own observed private state, avoiding the action-space explosion caused by a growing number of user devices. Simulation results show that the proposed HDMAPPO strategy outperforms baseline algorithms in terms of average task latency, energy consumption, and task discard rate.
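
The two-level decision structure described in the abstract can be pictured as two small per-device policies: a high-level policy that picks where to offload (one of the MEC servers or local execution) and a low-level policy that picks what fraction of the task to offload, each acting only on the device's own private observation. The following is a minimal PyTorch sketch under assumed settings; the observation size, network shapes, Beta-distributed ratio action, and all names here are illustrative assumptions, not the paper's actual HDMAPPO architecture, and the MAPPO critic and training loop are omitted.

```python
# Illustrative sketch only: dimensions, layer sizes, and the choice of a Beta
# distribution for the offload ratio are assumptions for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HighLevelActor(nn.Module):
    """Selects a task offloading location: one of n_servers MEC servers, or local execution."""

    def __init__(self, obs_dim: int, n_servers: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, n_servers + 1),  # +1 action for local execution
        )

    def forward(self, obs: torch.Tensor) -> torch.distributions.Categorical:
        return torch.distributions.Categorical(logits=self.net(obs))


class LowLevelActor(nn.Module):
    """Chooses the task offloading ratio in [0, 1], conditioned on the chosen location."""

    def __init__(self, obs_dim: int, n_servers: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + n_servers + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, 2),  # Beta parameters keep the sampled ratio inside [0, 1]
        )

    def forward(self, obs: torch.Tensor, location_onehot: torch.Tensor):
        x = torch.cat([obs, location_onehot], dim=-1)
        alpha, beta = F.softplus(self.net(x)).unbind(-1)
        return torch.distributions.Beta(alpha + 1.0, beta + 1.0)


# Each user device acts only on its own (private) local observation.
obs_dim, n_servers = 8, 3  # assumed sizes
high = HighLevelActor(obs_dim, n_servers)
low = LowLevelActor(obs_dim, n_servers)

obs = torch.randn(obs_dim)                                  # device's private state
loc = high(obs).sample()                                    # which server (or local) to use
loc_onehot = F.one_hot(loc, n_servers + 1).float()
ratio = low(obs, loc_onehot).sample()                       # fraction of the task to offload
```

In a MAPPO-style setup, both actors would be trained with the clipped PPO objective against a centralized critic, while execution remains fully decentralized as above, so the per-device action space stays fixed regardless of how many devices join the system.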
