Abstract
We investigate a computing task scheduling problem in the space-air-ground integrated network (SAGIN) for the Internet of remote Things (IoRT). In the considered scenario, unmanned aerial vehicles (UAVs) collect computing tasks from IoRT devices and make offloading decisions, whereby each task can either be computed at the UAV or offloaded to a low earth orbit (LEO) satellite. The optimization objective is to design an offloading strategy for each UAV that maximizes the number of tasks satisfying the delay constraint while reducing the energy consumption of the UAVs. We formulate this problem as a nonlinear integer optimization problem and then remodel it as a stochastic game. To cope with the dynamic and complex environment, we propose a learning-based orbital edge offloading (LOEF) approach that coordinates all UAVs in learning the optimal offloading strategies. The method is built on the Actor-Critic framework of multi-agent reinforcement learning: each Actor observes the environment and outputs the current offloading decision, while the Critic evaluates the Actors' actions and coordinates all UAVs to increase the system reward. Simulation results show that, compared with the baseline strategies, the proposed offloading strategy converges quickly and increases both the number of tasks satisfying the delay constraint and the energy utilization of the UAVs.
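To illustrate the multi-agent Actor-Critic structure described above, the following is a minimal sketch, not the paper's LOEF implementation: each UAV has its own Actor that maps a local observation to a discrete offloading decision (compute locally or offload to the LEO satellite), and a centralized Critic scores the joint observations and actions so that updates can coordinate all UAVs. All dimensions, network sizes, and variable names are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions for illustration only (not taken from the paper).
OBS_DIM = 8        # per-UAV observation, e.g. task size, queue length, channel state
N_ACTIONS = 2      # 0 = compute at the UAV, 1 = offload to the LEO satellite
N_UAVS = 3         # number of UAV agents

class Actor(nn.Module):
    """Per-UAV policy: maps a local observation to a distribution over offloading actions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, obs):
        return torch.distributions.Categorical(logits=self.net(obs))

class CentralCritic(nn.Module):
    """Centralized critic: scores the joint observations and actions of all UAVs,
    so each Actor's update accounts for the other agents' offloading decisions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_UAVS * (OBS_DIM + N_ACTIONS), 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, joint_obs, joint_actions_onehot):
        return self.net(torch.cat([joint_obs, joint_actions_onehot], dim=-1)).squeeze(-1)

actors = [Actor() for _ in range(N_UAVS)]
critic = CentralCritic()

# One illustrative decision step; random observations stand in for the SAGIN state.
obs = torch.randn(N_UAVS, OBS_DIM)
dists = [actors[i](obs[i]) for i in range(N_UAVS)]
actions = torch.stack([d.sample() for d in dists])   # per-UAV offloading decisions
value = critic(obs.flatten(),
               nn.functional.one_hot(actions, N_ACTIONS).float().flatten())
print("offloading decisions per UAV:", actions.tolist(), "joint value:", value.item())
```

In a centralized-training, decentralized-execution setup of this kind, only the Actors are needed at run time on the UAVs, while the Critic is used during training to shape each Actor's policy toward the system-level reward (delay satisfaction and energy use).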