Abstract

With the growing popularity of smart mobile devices, the amount of data requested by users is increasing rapidly. The traditional centralized processing approach represented by the cloud computing model can no longer process such large volumes of data effectively. Mobile edge computing (MEC) has therefore emerged as a new computing model for handling this rapidly growing data and can better meet service requirements. Analogous to the task scheduling problem in cloud computing, a key issue in the MEC environment is task offloading and resource allocation. In this paper, we propose an adaptive task offloading and resource allocation algorithm for the MEC environment. The proposed algorithm uses deep reinforcement learning (DRL) to decide whether a task should be offloaded and to allocate computing resources to it. We simulate task generation as a Poisson process, and all tasks are submitted for processing as a task flow. In addition, we model the mobility of mobile user equipment (UE) between base stations (BSs), which brings the simulation closer to real application environments. The DRL method selects a suitable computing node for each task according to the optimization objective, and the optimal strategy for the objective problem is learned during training. Compared with baseline algorithms in different MEC environments, our algorithm achieves the best performance in reducing the average task response time and the total system energy consumption and in improving system utility, benefiting both users and service providers.
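The abstract does not give implementation details, but the task-arrival model it describes (Poisson-distributed generation, tasks submitted as a flow, a DRL agent deciding local execution vs. offloading) can be illustrated with a minimal sketch. The arrival rate `lam`, the task size and cycle ranges, and the `choose_action` stub standing in for the trained DRL policy are all illustrative assumptions, not values or code from the paper.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Task:
    size_mb: float      # input data size (assumed units)
    cycles: float       # CPU cycles required (assumed units)
    arrival_slot: int   # time slot in which the task arrives


def generate_task_flow(num_slots: int, lam: float = 2.0, seed: int = 0):
    """Yield tasks slot by slot; the number of new tasks per slot
    is drawn from Poisson(lam), mirroring the arrival model the
    abstract describes."""
    rng = np.random.default_rng(seed)
    for t in range(num_slots):
        for _ in range(rng.poisson(lam)):
            yield Task(size_mb=rng.uniform(0.5, 5.0),
                       cycles=rng.uniform(1e8, 1e9),
                       arrival_slot=t)


def choose_action(task: Task) -> str:
    """Placeholder for the trained DRL policy: decide whether the
    task runs locally on the UE or is offloaded to an MEC server."""
    # A real agent would map a state (queue lengths, channel quality,
    # UE position relative to BSs, ...) to an offloading and
    # resource-allocation action; this threshold is only a stand-in.
    return "offload" if task.cycles > 5e8 else "local"


for task in generate_task_flow(num_slots=5):
    print(f"slot {task.arrival_slot}: {choose_action(task)}")
```

In the paper's setting, the `choose_action` stub would be replaced by a deep network trained to optimize the stated objective (average response time, energy consumption, and system utility).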
