Abstract

User Equipment (UE) is equipped with limited cache resources that can be utilized to offload data traffic through device-to-device (D2D) communications. Data caching at the UE level has the potential to significantly alleviate the data traffic burden on the backhaul link. However, in wireless networks, user mobility poses serious challenges to successful data transmission via D2D communications due to intermittent connectivity among users. At the same time, user mobility can be exploited to cache contents efficiently by observing connectivity patterns among users. It is therefore crucial to develop an efficient data caching mechanism for UEs that takes user mobility patterns into account. In this work, we propose a mobility-aware data caching approach to enhance data offloading via D2D communication. First, we model users' connectivity patterns. Then, contents are cached in the UEs' cache resources based on users' data preferences. In addition, we take into account the signal-to-interference-plus-noise ratio (SINR) requirements of the users. Hence, our proposed caching mechanism exploits users' connectivity patterns to perform data placement based on the demands of both the users themselves and their neighbors, enhancing data offloading via cache resources. We performed extensive simulations to investigate the performance of our proposed mobility-aware data caching mechanism. Its performance is compared to widely deployed data caching mechanisms, while taking into account the dynamic nature of the wireless channel and the interference experienced by the users. The obtained results show that our proposed approach achieves 14%, 16%, and 11% higher data offloading gain than the least frequently used, the Zipf-based probabilistic, and the random caching schemes in the cases of an increasing number of users, cache capacity, and number of contents, respectively.
Moreover, we also analyzed cache hit rates: our proposed scheme achieves 8% and 5% higher cache hit rates than the least frequently used, the Zipf-based probabilistic, and the random caching schemes in the cases of an increasing number of contents and cache capacity, respectively. Hence, our proposed caching mechanism brings a significant improvement in data sharing via D2D communications.
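The Zipf-based probabilistic caching baseline mentioned above ranks contents by popularity and fills each UE's cache by sampling contents with probability proportional to their Zipf weight. The sketch below is an illustration of that generic baseline, not the paper's own placement algorithm; the skew parameter `s = 0.8` and the function names are assumptions for the example.

```python
import random

def zipf_popularity(n_contents, s=0.8):
    """Zipf popularity: p_i proportional to i^(-s) for content rank i (1 = most popular)."""
    weights = [i ** -s for i in range(1, n_contents + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def zipf_probabilistic_cache(n_contents, cache_size, s=0.8, rng=None):
    """Fill one UE cache by repeatedly drawing contents with Zipf-weighted
    probability; duplicates are skipped so the cache holds distinct items."""
    rng = rng or random.Random()
    probs = zipf_popularity(n_contents, s)
    ranks = list(range(n_contents))
    cached = set()
    while len(cached) < min(cache_size, n_contents):
        pick = rng.choices(ranks, weights=probs, k=1)[0]
        cached.add(pick)
    return cached
```

Popular (low-rank) contents end up cached on many UEs, which raises the chance that a neighbor holds a requested item; a mobility-aware scheme additionally weights placement by how often users meet, which is what the comparison in the abstract measures.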
