Abstract
Proactive caching shows great potential to reduce peak download rates by caching popular data, in advance, at the network edge. Fast-changing file features, such as rapidly varying file popularities and file contents (data freshness), pose a challenge for proactive caching when cache contents are updated much more slowly, which decreases the efficiency and usability of caching. We present a dynamic caching scheme that updates local user caches and optimizes the use of caching resources by index-coding the cache updates with the delivery messages. The developed scheme is presented for a network with one cache-enabled server, holding a pool of files, communicating with $K$ cache-enabled receivers whose requests are limited to the server’s file pool. The scheme assumes only partial knowledge of the feature variations, and file delivery is asynchronous because the receivers’ request times are not flexible. We show that the file delivery messages can be used to proactively and continuously update the receivers’ finite caches by index-coding the update messages with the delivery messages at no additional rate cost. We also show that this mechanism reduces the downloaded traffic and can be used to improve other QoS metrics.
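To illustrate the index-coding idea referred to in the abstract, the following is a minimal sketch, not the paper’s actual scheme: a cache-update payload is XORed with a delivery payload so that one broadcast serves both a requesting receiver and a receiver whose cache needs refreshing. The two-receiver setup, file names, and contents are assumptions made purely for exposition.

```python
# Minimal sketch (hypothetical setup, not the paper's algorithm):
#   - Receiver 1 requests file A and already caches the current version of file B.
#   - Receiver 2 caches file A, but its copy of file B is stale and should be
#     refreshed to the current version.
# Broadcasting A XOR B lets receiver 1 decode A (its request) and receiver 2
# decode the fresh B (its cache update), i.e. the update rides along at no
# additional rate cost.

def xor_bytes(x: bytes, y: bytes) -> bytes:
    """XOR two equal-length byte strings (files would be segmented/padded in practice)."""
    assert len(x) == len(y)
    return bytes(a ^ b for a, b in zip(x, y))

# Server side: current versions of the files (hypothetical contents).
file_a = b"AAAAAAAA"          # requested by receiver 1
file_b_current = b"BBBBBBBB"  # fresh version of B, needed to update receiver 2's cache

broadcast = xor_bytes(file_a, file_b_current)  # a single coded transmission

# Receiver 1: uses its cached current B to decode its requested file A.
decoded_a = xor_bytes(broadcast, file_b_current)
assert decoded_a == file_a

# Receiver 2: uses its cached A to decode the fresh B and refresh its stale entry.
decoded_b = xor_bytes(broadcast, file_a)
assert decoded_b == file_b_current

print("One broadcast delivered receiver 1's request and updated receiver 2's cache.")
```

The design choice this sketch highlights is that the coded broadcast has the same length as the uncoded delivery message, so the cache update is carried without increasing the delivered rate, provided each receiver holds suitable side information in its cache.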