Abstract

In this paper, we propose a novel optimization framework for a secure and green mobile edge computing (MEC) network based on deep reinforcement learning, where the secure data transmission is threatened by an unmanned aerial vehicle (UAV). To alleviate the local computation burden, some computational tasks can be offloaded to computational access points (CAPs), at the cost of price, transmission latency, and energy consumption. By jointly reducing the price, latency, and energy consumption, the proposed framework optimizes the network under four criteria: criterion I minimizes a linear combination of price, latency, and energy consumption; criterion II minimizes the price subject to constraints on latency and energy consumption; criterion III minimizes the latency subject to constraints on price and energy consumption; and criterion IV minimizes the energy consumption subject to constraints on price and latency. For each criterion, we then devise an optimization framework that dynamically adjusts the task offloading ratio and the bandwidth allocation ratio simultaneously, where a novel feature extraction network is introduced to improve the training performance. Simulation results are finally presented to verify the effectiveness of the proposed optimization framework.
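To make the multi-objective formulation concrete, a minimal sketch of criterion I is given below. The non-negative weights w_1, w_2, w_3 on the total price P, latency T, and energy consumption E, as well as the symbols alpha (task offloading ratio) and beta (bandwidth allocation ratio), are illustrative assumptions rather than the paper's exact notation; the precise system model is defined in the body of the paper.

% Hedged sketch of criterion I: weighted-sum minimization over price P,
% latency T, and energy consumption E, with the task offloading ratio alpha
% and bandwidth allocation ratio beta as decision variables. Weights and
% variable names are assumptions for illustration only.
\begin{equation}
  \min_{\alpha,\;\beta}\;\; w_1\, P(\alpha,\beta) \;+\; w_2\, T(\alpha,\beta) \;+\; w_3\, E(\alpha,\beta)
  \quad \text{s.t.}\;\; 0 \le \alpha \le 1,\;\; 0 \le \beta \le 1 .
\end{equation}

Under this reading, criteria II, III, and IV simply exchange the roles of objective and constraints: one of P, T, or E is minimized while the other two are bounded by constraints.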
