Abstract
Caching is an efficient way to reduce network traffic congestion during peak hours by storing some content at the users’ local caches. For the shared-link network with end-user caches, Maddah-Ali and Niesen proposed a two-phase coded caching strategy. In practice, users may communicate with the server through intermediate relays. This paper studies the tradeoff between the memory size M and the network load R for networks where a server with N files is connected to H relays (without caches), which in turn are connected to K users equipped with caches of M files. When each user is connected to a different subset of r relays, i.e., K = (H choose r), the system is referred to as a combination network with end-user caches. In this work, converse bounds are derived for the practically motivated case of uncoded cache contents, that is, bits of the various files are directly pushed into the user caches without any coding. In this case, once the cache contents and the users’ demands are known, the problem reduces to a general index coding problem. This paper shows that relying on a well-known “acyclic index coding converse bound” results in converse bounds that are not tight for combination networks with end-user caches. A novel converse bound that leverages the network topology is proposed, which is the tightest converse bound known to date. As a result of independent interest, an inequality that generalizes the well-known sub-modularity of entropy is derived. Several novel caching schemes are proposed, based on the Maddah-Ali and Niesen cache placement. These schemes leverage the structure of the combination network and/or perform interference elimination at the end-users. The proposed schemes are proved: (i) to be (order) optimal for some (N, M, H, r) parameter regimes under the constraint of uncoded cache placement, and (ii) to outperform the state-of-the-art schemes in numerical evaluations.
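To make the setup concrete, the following is a minimal sketch (not part of the paper) that enumerates the users of a combination network as r-subsets of the H relays and lists the memory-load points of the Maddah-Ali and Niesen (MAN) scheme for the shared-link baseline. The parameters H, r, and N are illustrative assumptions; the load formula shown is the classical shared-link MAN load, not one of the combination-network schemes proposed in the paper.

```python
from itertools import combinations
from math import comb

# Illustrative parameters (assumptions, not from the paper).
H, r = 4, 2          # number of relays, relays per user
K = comb(H, r)       # number of users in the combination network
N = K                # number of files (here chosen equal to K)

# Each user is identified with a distinct r-subset of the H relays.
users = list(combinations(range(H), r))
assert len(users) == K

# MAN uncoded placement: for integer t = K*M/N, each file is split into
# comb(K, t) subfiles, one per t-subset of users, and user k caches every
# subfile indexed by a subset containing k. The load below is the
# shared-link MAN delivery load, shown only as a baseline.
for t in range(K + 1):
    M = t * N / K                 # cache size in files
    R_man = (K - t) / (t + 1)     # shared-link MAN load at this memory point
    print(f"t={t}: M={M:.2f} files, shared-link MAN load R={R_man:.3f}")
```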