Abstract

Recently, we have witnessed rapid growth in the intelligence and processing capability of user equipment, accompanied by an explosive increase in wireless data traffic. Given this huge demand and the shortage of resources to meet it, device-to-device (D2D) communication has emerged as a promising solution for reducing backhaul-link costs through collaborative file caching. This paper surveys existing content caching frameworks for D2D communication environments. It then proposes two novel frameworks: one leveraging a combination of recurrent and deep neural networks, and another based on the latest deep language models that use attention mechanisms, known as transformers. The proposed frameworks require minimal a priori knowledge about the environment and exploit the growth of user data to improve performance. Our experiments show that the proposed frameworks are adaptive in nature and learn from the historical data of the environment. Further, we achieve an overall 25% increase in the D2D cache hit rate over a recently proposed framework that uses neural networks for collaborative filtering (NCF) to make caching decisions.
