Abstract
With the explosive growth of mobile multimedia traffic, content caching is seen as an effective solution to alleviate the heavy traffic burden on the back-haul and front-haul and to improve the quality of real-time data services. The concept of "Cache as a Service" (CaaS) is a framework for caching virtualization in mobile cloud-based networks, under which contents can be distributed and stored according to their popularity, traffic diversity, and diverse user demands. In this paper, we mainly consider the problem of allocating computing resources for very low-latency services, as well as high-data-rate services that require sufficient spectral resources. We exploit the benefits of a fully virtualized environment, in which mobile virtual network operators (MVNOs) and virtual service providers (VSPs) are connected in the Cloud through Network as a Service (NaaS) using distributed Infrastructure as a Service (IaaS). VSPs provide services, including Software as a Service (SaaS), to Internet of Things (IoT) devices. The IoT devices, as service requesters, take advantage of emerging caching techniques (CaaS) to obtain on-demand low-latency services that require a large amount of computing resources and high bandwidth. To satisfy the quality of service (QoS) requirements, Radio Access Network as a Service (RANaaS) is the pivot of this environment: it dynamically allocates networking, computing, and storage resources according to the latency and throughput requirements of the requested services. We therefore propose a many-to-many matching game between the set of IoT devices and the set of VSPs. To solve this game, we exploit the deferred acceptance algorithm, which enables the players to self-organize into a stable matching within a reasonable number of iterations. The goal of the proposed many-to-many game-theoretic approach is to optimize the caching space that VSPs exploit at the edge to store files or software required by IoT devices. Simulation results demonstrate that our proposed matching strategy, coupled with the CaaS caching capabilities of a distributed F-RAN, significantly outperforms traditional caching strategies in terms of cache hit ratio, average latency, and back-haul traffic load.
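To make the matching step concrete, the following is a minimal sketch of device-proposing deferred acceptance for a many-to-many matching between IoT devices and VSPs with cache-slot quotas. It is not the paper's exact formulation: the function name `deferred_acceptance`, the preference lists, and the quotas in the toy example are illustrative assumptions only.

```python
# Minimal sketch (assumed names and data): device-proposing deferred acceptance
# for a many-to-many matching between IoT devices and VSPs with quotas.

def deferred_acceptance(device_prefs, vsp_prefs, device_quota, vsp_quota):
    """device_prefs[d] : VSPs ordered from most to least preferred by device d.
       vsp_prefs[v]    : devices ordered from most to least preferred by VSP v.
       device_quota[d] : max number of VSPs device d may be matched to.
       vsp_quota[v]    : max number of devices (cache slots) VSP v may accept."""
    rank = {v: {d: i for i, d in enumerate(prefs)} for v, prefs in vsp_prefs.items()}
    matched = {d: set() for d in device_prefs}   # current matches per device
    accepted = {v: set() for v in vsp_prefs}     # current matches per VSP
    next_choice = {d: 0 for d in device_prefs}   # index of the next VSP to propose to

    active = set(device_prefs)
    while active:
        d = active.pop()
        # Propose while the device has free slots and untried VSPs left.
        while len(matched[d]) < device_quota[d] and next_choice[d] < len(device_prefs[d]):
            v = device_prefs[d][next_choice[d]]
            next_choice[d] += 1
            if d not in rank[v]:
                continue                          # VSP v finds device d unacceptable
            accepted[v].add(d)
            matched[d].add(v)
            if len(accepted[v]) > vsp_quota[v]:
                # Over quota: reject the least preferred device currently held by v.
                worst = max(accepted[v], key=lambda x: rank[v][x])
                accepted[v].remove(worst)
                matched[worst].discard(v)
                if worst != d:
                    active.add(worst)             # rejected device will propose again
    return accepted


# Toy instance (illustrative): 3 IoT devices requesting content, 2 VSPs with cache slots.
device_prefs = {"iot1": ["vsp1", "vsp2"], "iot2": ["vsp1", "vsp2"], "iot3": ["vsp2", "vsp1"]}
vsp_prefs = {"vsp1": ["iot2", "iot1", "iot3"], "vsp2": ["iot1", "iot3", "iot2"]}
print(deferred_acceptance(device_prefs, vsp_prefs,
                          device_quota={"iot1": 1, "iot2": 1, "iot3": 2},
                          vsp_quota={"vsp1": 1, "vsp2": 2}))
```

Each proposal advances a device's preference pointer, so the procedure terminates after finitely many proposals and returns a matching in which no device-VSP pair prefers each other to their current assignments, which is the stability notion the abstract refers to.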