Caching has attracted much attention recently because it holds the promise of scaling the service capability of radio access networks (RANs). We envision that caching will ultimately make next-generation RANs more than bit pipelines, evolving them into a multi-disciplinary area that unites communications with pricing, recommendation, compression, and computation units. By summarizing state-of-the-art caching policies, we trace the common root of their gains to prolonged transmission time, which is then traded for higher spectral or energy efficiency. To realize caching, the physical layer and higher layers must function together, aided by prediction and memory units, which substantially broadens the concept of cross-layer design into a multi-unit collaboration methodology. We revisit caching from this generalized cross-layer perspective, focusing on its emerging opportunities, challenges, and theoretical performance limits. To motivate the application and evolution of caching, we conceive a hierarchical pricing infrastructure that provides incentives to both network operators and users. To make RANs even more proactive, we design caching and recommendation jointly, showing a user both what it might be interested in and what has already been done for it. Furthermore, user-specific demand prediction motivates edge compression and proactive mobile edge computing (MEC) as new applications. The beyond-bit-pipeline RAN is a paradigm shift that brings with it many cross-disciplinary research opportunities.