Abstract

In the Industrial Internet of Things (IIoT), devices process various types of tasks for small-quantity batch production, but their limited battery lifespans and computational capabilities pose many challenges. To overcome these limitations, Mobile Edge Computing (MEC) has been introduced, and task offloading techniques that execute tasks on a MEC server (MECS) have attracted much attention. However, a MECS has limited computational capability; if a large number of tasks are offloaded to it, the burden on the server and the cellular network increases, which can reduce the quality of service of task execution. Thus, offloading tasks between nearby devices through device-to-device (D2D) communication is drawing attention. We propose an optimal task offloading decision strategy for a combined MEC and D2D communication architecture, aiming to minimize both the energy consumption of devices and the task execution delay under delay constraints. To solve this problem, we adopt the Q-learning algorithm, a Reinforcement Learning (RL) method. Simulation results show that the proposed algorithm outperforms other methods in terms of device energy consumption and task execution delay.
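The offloading decision described in the abstract can be illustrated with a minimal Q-learning sketch. This is not the authors' implementation: the state (a congestion level), the three actions (local execution, MECS offload, D2D offload), and all cost values are hypothetical placeholders standing in for the paper's energy/delay model.

```python
import random

# Hypothetical toy model: a device chooses where each task runs.
ACTIONS = ["local", "mec", "d2d"]  # offloading decisions

def cost(state, action):
    """Assumed weighted energy-plus-delay cost; purely illustrative.
    The MECS cost grows with network congestion (state in 0..2)."""
    base = {"local": 5.0, "mec": 2.0, "d2d": 3.0}[action]
    if action == "mec":
        base += 2.0 * state
    return base + random.uniform(-0.5, 0.5)  # small random fluctuation

def train(episodes=5000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    random.seed(seed)
    # Tabular Q-values over (state, action) pairs
    q = {(s, a): 0.0 for s in range(3) for a in ACTIONS}
    state = 0
    for _ in range(episodes):
        # epsilon-greedy action selection
        if random.random() < eps:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        reward = -cost(state, action)      # minimizing cost = maximizing reward
        next_state = random.randrange(3)   # congestion evolves at random here
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        # Standard Q-learning update rule
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state
    return q

q = train()
# Greedy policy learned from the Q-table: best action per congestion level
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(3)}
print(policy)
```

Under these toy costs, the learned policy offloads to the MECS when congestion is low and switches to D2D offloading as congestion rises, mirroring the trade-off the paper targets.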
