Abstract

By caching content at base stations (BSs) in a cooperative manner, mobile edge caching (MEC) can alleviate the heavy backhaul burden and reduce duplicated transmissions of content downloads, and has recently been considered a promising solution to cope with exponentially increasing data traffic. However, maximizing storage utilization while reducing service latency and improving energy savings remains a critical issue in large-scale mobile edge networks (MENs), since the growing size of MENs and the uneven distribution of users make it difficult to determine which MEC should cache which content. To address this problem, we propose a hybrid collaborative caching (Hy-CoCa) design that jointly leverages local independent, intra-group collaborative, and intra-network collaborative caching. MECs are clustered into disjoint groups, and each MEC's storage is partitioned into local, intra-group, and intra-network portions. The local storage of each MEC is reserved for the most popular contents, so that users can fetch them directly from their associated MEC. The intra-group storage of all MECs within the same group is treated as a single entity that collaboratively stores moderately popular contents, reducing the probability of requesting contents from distant MECs. The intra-network storage of all MECs is leveraged to collaboratively distribute less popular contents across the entire MEN, improving overall content diversity. Specifically, we first develop the Hy-CoCa framework to serve users' requests locally and construct logical groups based on users' distribution and MECs' proximity. Then, under storage and popularity constraints, we formulate a storage allocation optimization problem to minimize the average service latency and derive the optimal storage allocation.
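To make the three-tier split concrete, the following sketch (not taken from the paper; all parameter names are illustrative) computes the fraction of requests served by each tier when content popularity follows a Zipf law, each MEC locally caches the top items, a group jointly caches the next band, and the whole network jointly caches a further band:

```python
def tier_hit_probs(n_contents, zipf_s, c_local, c_group, c_net,
                   group_size, n_mecs):
    """Illustrative hit probabilities for a three-tier cache split
    under a Zipf(zipf_s) popularity model.

    Local tier:   each MEC caches the c_local most popular items.
    Group tier:   a group of group_size MECs jointly caches the
                  next c_group * group_size items.
    Network tier: all n_mecs MECs jointly cache the next
                  c_net * n_mecs items.
    """
    # Zipf popularity distribution over content ranks 1..n_contents
    pop = [r ** (-zipf_s) for r in range(1, n_contents + 1)]
    total = sum(pop)
    pop = [p / total for p in pop]

    local_end = c_local
    group_end = local_end + c_group * group_size
    net_end = min(group_end + c_net * n_mecs, n_contents)

    p_local = sum(pop[:local_end])           # served by own MEC
    p_group = sum(pop[local_end:group_end])  # served inside the group
    p_net = sum(pop[group_end:net_end])      # served network-wide
    return p_local, p_group, p_net
```

Under a skewed (Zipf) popularity, the small local tier already captures most of the demand, which is the rationale for reserving it for the most popular contents while the larger collaborative tiers extend coverage to the popularity tail.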
Furthermore, given the optimal storage allocation, we formulate the request-aware content placement problem as an integer linear program that maximizes the overall energy savings. We prove the submodularity of the objective function and propose a greedy algorithm with linear computational complexity that achieves (1−1/e)-optimality. Simulation results on real-world YouTube trace data demonstrate that our caching strategy achieves a 6% to 28% latency reduction and a 9% to 75% improvement in energy savings compared with existing caching strategies.
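The (1−1/e) guarantee follows the classical greedy scheme for maximizing a monotone submodular function under capacity constraints: repeatedly place the (content, MEC) pair with the largest marginal gain until no feasible placement improves the objective. A minimal sketch, assuming a hypothetical `gain(placement, c, m)` oracle that returns the marginal energy saving of caching content `c` at MEC `m` (the paper's actual objective and interface may differ):

```python
def greedy_placement(contents, mecs, capacity, gain):
    """Greedy placement for a monotone submodular objective.

    contents  : list of content identifiers
    mecs      : list of MEC identifiers
    capacity  : dict mapping each MEC to its cache size (in items)
    gain      : gain(placement, c, m) -> marginal benefit of adding
                (c, m) to the current placement (hypothetical oracle)
    """
    placement = set()                 # chosen (content, mec) pairs
    used = {m: 0 for m in mecs}       # slots consumed per MEC
    candidates = {(c, m) for c in contents for m in mecs}
    while candidates:
        # pick the candidate pair with the largest marginal gain
        best = max(candidates, key=lambda cm: gain(placement, *cm))
        candidates.remove(best)
        c, m = best
        if gain(placement, c, m) <= 0:
            break                     # no remaining positive gain
        if used[m] < capacity[m]:     # respect the storage constraint
            placement.add((c, m))
            used[m] += 1
    return placement
```

This naive version re-evaluates marginal gains on every iteration; a lazy-evaluation variant with a priority queue is the usual way to reach the linear complexity claimed in the paper, since submodularity guarantees marginal gains only decrease.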