Abstract

Wireless powered mobile edge computing (MEC) is one of several promising network models to have emerged in recent years, and this paper studies such a network. The proposed deep reinforcement learning-based online offloading (DROO) framework learns from past offloading experience, so it avoids repeatedly solving complex mixed integer programming (MIP) problems, and its computational complexity does not surge as the network size grows. DROO decomposes the original optimization problem into an offloading decision sub-problem and a resource allocation sub-problem. It operates directly on the continuous channel gain space, so the channel gains do not need to be discretized, which avoids the curse of dimensionality. Simulation results show that the DROO algorithm achieves near-optimal performance compared with existing design and computation methods while reducing CPU execution latency by more than an order of magnitude, making real-time system optimization in fading environments practical. Wireless powered MEC networks are therefore genuinely feasible in such environments.
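The sketch below illustrates the decomposition described in the abstract, not the authors' implementation: a learned policy maps continuous channel gains to a relaxed offloading decision, which is quantized into a handful of binary candidates; each candidate is scored by a resource allocation sub-problem solver, and the best one is kept for execution and later training. The policy network and the resource allocation solver here are placeholders (a random linear policy and a toy computation-rate score), and all names and parameter values are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10          # number of wireless devices (assumed)
K = 8           # number of binary offloading candidates per time frame (assumed)
W = rng.normal(size=(N, N)) * 0.1   # stand-in for a trained DNN's weights

def policy(h):
    """Map channel gains h to a relaxed offloading decision in [0, 1]^N."""
    return 1.0 / (1.0 + np.exp(-(W @ h)))

def quantize(x_relaxed, k):
    """Produce k binary candidates around the relaxed decision x_relaxed."""
    base = (x_relaxed > 0.5).astype(int)
    order = np.argsort(-np.abs(x_relaxed - 0.5))   # most- to least-confident entries
    candidates = [base.copy()]
    for i in range(min(k - 1, len(order))):
        flipped = base.copy()
        flipped[order[-(i + 1)]] ^= 1              # flip the least confident entries
        candidates.append(flipped)
    return candidates

def resource_allocation_value(h, x):
    """Toy stand-in for the convex resource allocation sub-problem:
    returns a scalar computation-rate score for binary decision x."""
    local = np.sum((1 - x) * np.sqrt(h))      # devices computing locally
    offload = np.sum(x * np.log1p(h))         # devices offloading to the edge
    return local + offload

# One time frame: observe channels, decide, and record the best (gain, decision) pair.
h = rng.exponential(scale=1.0, size=N)        # continuous channel gains, no discretization
x_relaxed = policy(h)
best_x = max(quantize(x_relaxed, K),
             key=lambda x: resource_allocation_value(h, x))
replay_memory = [(h, best_x)]                 # later used to retrain the policy
print("channel gains:", np.round(h, 2))
print("offloading decision:", best_x)
```

Because each time frame only requires one forward pass of the policy plus the evaluation of a few binary candidates, the per-decision cost stays low even as the network grows, which is the source of the latency reduction claimed above.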
