Abstract

Mobile Edge Computing (MEC) can offload terminal tasks to edge servers to conserve the terminals' limited resources. We propose a dynamic offloading strategy for an edge computing system in a cellular network. We formulate an optimization problem that minimizes energy consumption and offloading cost, and transform the offloading decision problem into a multi-label classification problem, making it more tractable for deep reinforcement learning (DRL). We construct a DRL-based offloading decision model in which the system state comprises the relevant variables and the action is the offloading strategy. The DRL framework is implemented with a deep neural network that learns binary offloading decisions from experience replay and updates its parameters accordingly. The network's output is then quantized by the K-Nearest Neighbor (KNN) algorithm. Simulation results show that this method not only solves the combinatorial optimization problem but also significantly reduces the computational burden and the total resource overhead.
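The KNN quantization step described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: it assumes the network emits a relaxed offloading vector with entries in [0, 1], and it finds the k nearest binary decision vectors by brute-force enumeration, which is feasible only for a small number of tasks. The function name `knn_quantize` is an assumption for illustration.

```python
import itertools
import numpy as np

def knn_quantize(relaxed, k):
    """Quantize a relaxed offloading vector (entries in [0, 1]) into the
    k binary decision vectors nearest in Euclidean distance.

    Brute-force enumeration of all 2^n binary vectors; practical only
    for a small task count n."""
    n = len(relaxed)
    # All 2^n candidate binary offloading decisions, shape (2^n, n)
    candidates = np.array(list(itertools.product([0, 1], repeat=n)))
    # Euclidean distance from each candidate to the relaxed output
    dists = np.linalg.norm(candidates - np.asarray(relaxed), axis=1)
    # Keep the k closest candidates, nearest first
    return candidates[np.argsort(dists)[:k]]

# Example: relaxed DNN output for 3 tasks; keep 4 candidate binary actions.
relaxed = np.array([0.9, 0.2, 0.6])
actions = knn_quantize(relaxed, k=4)  # each row is one candidate decision
```

In a full system each of the k candidate binary actions would be evaluated against the reward (energy and offloading cost), and the best one stored in the experience replay buffer for training.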
