This paper considers computation offloading and service caching in a three-tier mobile cloud-edge computing architecture, in which Mobile Users (MUs) subscribe to the Cloud Service Center (CSC) for computation offloading services and pay the associated fees monthly or yearly, while the CSC provides computation services to subscribed MUs and charges service fees. Long transmission distances and the communication resource shortage caused by the growing number of offloading MUs may prevent the CSC from satisfying the delay requirements of MUs. Hence, the CSC can purchase computation and communication resources from Edge Servers (ESs), which have limited caching capacities and computation resources, to assist MUs with computation offloading. From the perspective of the CSC, however, it remains an open problem to jointly optimize the strategies of computation offloading, service caching, and resource allocation so as to meet the delay requirements of MUs while reducing the cost of the CSC. Therefore, a novel Deep Reinforcement Learning-based Computation Offloading and Service Caching Mechanism (DRLCOSCM) is proposed to jointly optimize the offloading decision, service caching, and resource allocation strategies, minimizing the cost of the CSC while guaranteeing the delay requirements of MUs. In DRLCOSCM, the optimization problem is formulated as a Mixed Integer Non-Linear Programming (MINLP) problem, and an Asynchronous Advantage Actor-Critic (A3C)-based algorithm is proposed to solve it. Simulation results show that DRLCOSCM significantly outperforms the baseline methods across different scenarios.
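To make the A3C-style component concrete, the following is a minimal sketch of a single advantage actor-critic update, the building block that A3C runs asynchronously across many workers. It is not the paper's algorithm: the state features (e.g. task size, edge load), the three offloading actions, and the reward are fabricated assumptions purely for illustration, and a simple linear critic with a softmax policy stands in for the neural networks a real implementation would use.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 4                          # hypothetical state features (task size, load, ...)
ACTIONS = ["local", "edge", "cloud"]    # hypothetical offloading choices

theta = np.zeros((len(ACTIONS), N_FEATURES))  # actor: softmax policy weights
w = np.zeros(N_FEATURES)                      # critic: linear value weights

def softmax(z):
    z = z - z.max()                     # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def policy(state):
    """Action probabilities pi(a | s) under the softmax actor."""
    return softmax(theta @ state)

def update(state, action, reward, next_state, gamma=0.99, lr=0.01):
    """One synchronous actor-critic step; A3C runs many such workers
    asynchronously and merges their gradients into shared weights."""
    global theta, w
    td_target = reward + gamma * (w @ next_state)
    advantage = td_target - (w @ state)          # A(s,a) approximated by the TD error
    # Critic: move V(s) toward the TD target.
    w += lr * advantage * state
    # Actor: policy-gradient step, log-prob gradient weighted by the advantage.
    probs = policy(state)
    grad_log = -np.outer(probs, state)
    grad_log[action] += state
    theta += lr * advantage * grad_log

# Toy rollout: the fabricated reward favors "edge" offloading, so the
# policy should learn to prefer it.
for _ in range(500):
    s = rng.random(N_FEATURES)
    a = rng.choice(len(ACTIONS), p=policy(s))
    r = 1.0 if ACTIONS[a] == "edge" else 0.0
    update(s, a, r, rng.random(N_FEATURES))

print(ACTIONS[int(np.argmax(policy(np.ones(N_FEATURES))))])
```

In the paper's setting the reward would instead encode the CSC's cost and the MUs' delay constraints, and the discrete offloading/caching decisions together with continuous resource allocation are what make the underlying problem an MINLP that motivates a learning-based solver.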