Abstract

In-network caching plays an increasingly important role in today's network architectures, driven by the explosive growth of data traffic from the proliferation of mobile devices, the demand for high-volume media content, and the rise of low-latency applications such as VR/AR and cloud gaming. Replicating popular content in caches located closer to end users than central servers can significantly reduce backbone traffic, lower request latency, and balance server load. This thesis studies two problems in the field of network caching.

In the first part, we study fair caching policies in multi-hop caching networks with arbitrary topology. We introduce a utility maximization framework that finds caching decisions reducing the aggregate request routing cost in the network while accounting for fairness. The utility maximization problem is NP-hard, and we propose two efficient approximation algorithms that exploit the submodularity of the objective function. Through simulations, we evaluate the performance of the two algorithms on different networks and discuss the effect of fairness on content distribution.

In the second part, we study how caching can be utilized in mobile networks. In particular, we jointly optimize the user association decision and caching at both base stations (BSs) and gateways (GWs). The resulting problem is also NP-hard. We propose a polynomial-time algorithm based on concave approximation and pipage rounding that produces a solution within a factor of 1-1/e of the optimal. Simulation results over a 5G network show that the proposed algorithm outperforms schemes that combine cache-independent user association methods with traditional caching strategies (e.g., LRU) in minimizing the aggregate routing cost and backhaul traffic while achieving a high data sum rate in the access network.
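To make the 1-1/e guarantee mentioned above concrete, the sketch below shows the classic greedy algorithm for monotone submodular maximization under a cardinality (cache-capacity) constraint, which is the standard technique underlying such approximation bounds. It is a minimal illustration, not the thesis's algorithm: the single-cache setting, the toy demand model, and names such as `greedy_cache` and `gain` are assumptions made for the example.

```python
# Minimal sketch: greedy selection for monotone submodular maximization
# under a cardinality constraint, which attains the 1 - 1/e guarantee
# cited in the abstract. The toy objective (expected routing-cost saving
# for one cache) is an illustrative assumption, not the thesis's utility.

def greedy_cache(items, capacity, gain):
    """Pick up to `capacity` items, each step adding the item with the
    largest marginal gain. If `gain` is monotone submodular, the result
    is within a factor of 1 - 1/e of the optimal cache placement."""
    cached = set()
    for _ in range(capacity):
        best, best_delta = None, 0.0
        for i in items - cached:
            delta = gain(cached | {i}) - gain(cached)
            if delta > best_delta:
                best, best_delta = i, delta
        if best is None:  # no remaining item improves the objective
            break
        cached.add(best)
    return cached

# Toy demand model (assumed): request rate per item and a fixed routing
# cost saved per request served from the cache instead of the origin.
rates = {"a": 5.0, "b": 3.0, "c": 1.0, "d": 0.5}
saving_per_hit = 2.0

def gain(cached):
    # Expected routing-cost saving; modular here, hence submodular.
    return saving_per_hit * sum(rates[i] for i in cached)

print(greedy_cache(set(rates), capacity=2, gain=gain))  # -> {'a', 'b'}
```

In the multi-hop and user-association settings treated in the thesis, the objective is submodular but not modular as in this toy, which is precisely why greedy-style methods and pipage rounding are used instead of simply ranking items by popularity.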
