Abstract

With the maturity of 5G technology and the popularization of smart terminal devices, the applications running on mobile terminals are becoming increasingly diverse. Many of them are complex, computationally intensive, and time-sensitive applications such as workflows and machine learning tasks. The traditional cloud computing model is located far from the mobile terminal and thus cannot meet the stringent delay and energy-consumption requirements of these applications. As a new computing model, mobile edge computing can better address these problems: it sinks part of the cloud's computing and storage resources to the network edge, close to the mobile devices. Through computation offloading, complex applications are offloaded to nearby edge servers for execution, which reduces delay and energy consumption. Existing research mainly focuses on independent task offloading in mobile edge computing and is therefore not suitable for offloading workflow tasks with dependencies. This paper proposes a multiple-workflows offloading strategy based on deep reinforcement learning in mobile edge computing, with the goal of minimizing the overall completion time of multiple workflows and the overall energy consumption of multiple items of user equipment. We evaluate the performance of the proposed strategy through simulation experiments based on real-world parameters. The results show that the proposed strategy outperforms the alternatives in terms of both overall completion time and overall energy consumption.

Keywords: Multiple workflows offloading; Multi-objective optimization; Multi-agent DDPG
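The abstract describes a bi-objective goal (overall completion time plus overall device energy) optimized by DRL agents. In such formulations the two objectives are commonly scalarized into a single reward signal. The sketch below is illustrative only, not the paper's actual formulation: the weights `w_t`, `w_e` and the normalization bounds `t_max`, `e_max` are assumed hyperparameters.

```python
def offloading_reward(completion_time, energy, t_max, e_max, w_t=0.5, w_e=0.5):
    """Illustrative scalarized reward for an offloading agent (assumption,
    not the paper's exact formula). Both objectives are normalized to [0, 1]
    and combined with weights, so the reward lies in [-1, 0]; lower
    completion time and energy yield a higher (less negative) reward."""
    t_norm = min(completion_time / t_max, 1.0)  # normalized completion time
    e_norm = min(energy / e_max, 1.0)           # normalized device energy
    return -(w_t * t_norm + w_e * e_norm)
```

With such a shared reward shape, each user equipment's agent (e.g., one actor-critic pair per device in a multi-agent DDPG setup) can be trained to trade delay against energy according to the chosen weights.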
