Abstract

Device-to-Device (D2D) communication with caching has emerged as a promising technique for offloading traffic and boosting the throughput of fifth-generation (5G) cellular networks. The combined impact of the cache memory sizes of user equipments (UEs) and the content sizes, two crucial factors in D2D-assisted caching networks, is usually ignored in existing research. In this paper, an optimization algorithm is proposed to maximize the cache hit probability and the cache-aided throughput while accounting for heterogeneous UE cache memory sizes and content sizes. First, users are grouped according to their cache memory sizes and contents are grouped according to their sizes. Then, general mathematical expressions for optimizing the cache hit probability and the cache-aided throughput under the cache memory size and content size constraints are derived. Subsequently, a Packet Cache Strategy (PCS) algorithm is proposed that takes the probability with which a user caches a file as the optimization variable and obtains the caching probability matrix that maximizes the cache hit probability and the cache-aided throughput. Finally, numerical results show that the sizes of the requested files affect users' willingness to cache, and that the proposed PCS achieves the highest cache hit probability and the best cache-aided throughput compared with two other existing methods.


Summary

INTRODUCTION

With the rapid explosion of data volumes and content diversity, data traffic is increasingly concentrated in hotspots [1]. Although researchers have shown that both a UE's cache memory size and the size of the desired file affect the content dissemination process, the existing literature generally assumes normalized content sizes and normalized UE capacities and treats content popularity as the only factor affecting a user's caching probability, ignoring the combined impact of these two constraints on the performance of D2D-assisted caching networks. Considering the combined impact of content sizes and capacity restrictions, we design a D2D-assisted caching algorithm in which UEs are classified into groups according to their cache memory sizes, contents are grouped by size, and the caching probability of users in each group is optimized to maximize the cache hit probability and the cache-aided throughput. We propose a Packet Cache Strategy (PCS) algorithm that, under the capacity restrictions and for desired contents of different sizes, finds the optimal solution for the cache hit probability and the cache-aided throughput.
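To make the grouped formulation concrete, the following is a minimal Python sketch of the setup described above, not the paper's actual PCS algorithm or its closed-form expressions. It assumes (as illustrative choices not taken from the source) Zipf-distributed requests, a homogeneous Poisson point process for each UE group, a fixed D2D cooperation radius, and a crude random search over feasible caching probability matrices; all parameter names and values are hypothetical.

import numpy as np

# --- Hypothetical parameters (illustrative only; not from the paper) ---
np.random.seed(0)
zipf_gamma  = 0.8                          # Zipf popularity exponent (assumed)
file_sizes  = np.array([1, 2, 4, 8])       # content-size groups (assumed units)
ue_caps     = np.array([4, 8, 16])         # UE cache-memory-size groups (assumed)
ue_density  = np.array([2e-4, 1e-4, 5e-5]) # density of each UE group, per m^2 (assumed)
coop_radius = 50.0                         # D2D cooperation radius in metres (assumed)

n_files, n_groups = len(file_sizes), len(ue_caps)

# Zipf request probabilities over the content groups
pop = 1.0 / np.arange(1, n_files + 1) ** zipf_gamma
pop /= pop.sum()

def cache_hit_probability(P):
    """Cache hit probability for a caching probability matrix P[g, f]
    (probability that a UE in group g caches file f), assuming each UE
    group forms a homogeneous PPP: a request for file f is a hit when at
    least one UE caching f lies within the cooperation radius."""
    area = np.pi * coop_radius ** 2
    # Mean number of UEs caching file f inside the cooperation area
    mean_cachers = area * (ue_density[:, None] * P).sum(axis=0)
    hit_per_file = 1.0 - np.exp(-mean_cachers)   # PPP void probability
    return float((pop * hit_per_file).sum())

def random_feasible_policy():
    """A feasible caching probability matrix respecting each group's
    cache-memory budget: sum_f size_f * P[g, f] <= capacity_g."""
    P = np.random.rand(n_groups, n_files)
    load = (P * file_sizes).sum(axis=1)
    scale = np.minimum(1.0, ue_caps / load)
    return P * scale[:, None]

# Crude random search as a stand-in for the paper's PCS optimization
best_P, best_hit = None, -1.0
for _ in range(5000):
    P = random_feasible_policy()
    h = cache_hit_probability(P)
    if h > best_hit:
        best_P, best_hit = P, h

print(f"best cache hit probability found: {best_hit:.4f}")

The sketch only optimizes the cache hit probability; extending the objective to the cache-aided throughput would require the paper's rate and interference model, which is not reproduced here.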

NETWORK MODEL
CACHE-AIDED THROUGHPUT
CACHE HIT PROBABILITY
NUMERICAL RESULTS
CONCLUSION
