Abstract

Mobile Edge Cloud Computing (MECC), as a promising partial computing offloading solution, has opened new possibilities for compute-intensive and delay-sensitive mobile applications, which can leverage edge computing and cloud services simultaneously. However, designing resource allocation strategies for MECC faces the challenging problem of simultaneously meeting the end-to-end latency requirements of multiple mobile applications while keeping resource allocation to a minimum. To address this issue, we comprehensively consider the randomness of computing request arrivals, service times, and dynamic computing resources. We model the MECC network as a two-level tandem queue consisting of two sequential processing queues, each with multiple servers. We apply a deep reinforcement learning algorithm, Deep Deterministic Policy Gradient (DDPG), to learn a computing speed adjustment strategy for the tandem queue. This strategy meets the end-to-end latency requirements of multiple mobile applications while preventing overuse of the total computing resources of the edge and cloud servers. Extensive simulation experiments demonstrate that our approach significantly outperforms other methods in dynamic network environments.
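The paper's system model and training details are not reproduced here, but the sketch below illustrates how the described two-level tandem queue (an edge stage followed by a cloud stage, each with multiple servers) could be framed as a reinforcement learning environment whose continuous action is the per-stage computing speed. All class names, parameter values, and the reward shaping are hypothetical placeholders, not the paper's settings; a standard DDPG agent would be trained against `reset()`/`step()` in the usual way.

```python
import numpy as np

class TandemQueueEnv:
    """Illustrative two-stage tandem queue (edge -> cloud) in discrete time slots.

    State : queue lengths at the edge and cloud stages.
    Action: computing speeds allocated to each stage, given as fractions in [0, 1]
            of the maximum per-server speed.
    Reward: negative resource usage minus a penalty whenever the estimated
            end-to-end latency exceeds the application's requirement.
    All parameter values below are placeholders, not taken from the paper.
    """

    def __init__(self, arrival_rate=4.0, edge_servers=3, cloud_servers=5,
                 max_speed=2.0, latency_req=5.0, seed=0):
        self.rng = np.random.default_rng(seed)
        self.arrival_rate = arrival_rate      # mean task arrivals per slot
        self.edge_servers = edge_servers
        self.cloud_servers = cloud_servers
        self.max_speed = max_speed            # max tasks served per slot per server
        self.latency_req = latency_req        # end-to-end latency bound (in slots)
        self.reset()

    def reset(self):
        self.q_edge, self.q_cloud = 0.0, 0.0
        return self._state()

    def _state(self):
        return np.array([self.q_edge, self.q_cloud], dtype=np.float32)

    def step(self, action):
        # action[0], action[1]: fraction of maximum speed used at the edge / cloud stage
        edge_speed, cloud_speed = np.clip(action, 0.0, 1.0) * self.max_speed

        # Random arrivals enter the edge queue; served tasks flow on to the cloud queue.
        arrivals = self.rng.poisson(self.arrival_rate)
        edge_served = min(self.q_edge + arrivals, edge_speed * self.edge_servers)
        self.q_edge = self.q_edge + arrivals - edge_served

        cloud_served = min(self.q_cloud + edge_served, cloud_speed * self.cloud_servers)
        self.q_cloud = self.q_cloud + edge_served - cloud_served

        # Crude end-to-end latency estimate via Little's law over both stages.
        latency = (self.q_edge + self.q_cloud) / max(self.arrival_rate, 1e-6)
        resource_cost = edge_speed * self.edge_servers + cloud_speed * self.cloud_servers
        penalty = 10.0 if latency > self.latency_req else 0.0
        reward = -resource_cost - penalty

        return self._state(), reward, False, {"latency": latency}
```

A continuous-control agent such as DDPG would observe the queue-length state returned by `step()` and output the two speed fractions as its action; the reward trades resource usage against latency-requirement violations, mirroring the objective described in the abstract.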
