Abstract
We introduce a framework and provably-efficient schemes for ‘fresh’ caching at the (front-end) local cache of content that is subject to ‘dynamic’ updates at the (back-end) database. We start by formulating the hard-cache-constrained problem for this setting, which quickly becomes intractable due to the limited cache capacity. To bypass this challenge, we first propose a flexible time-based-eviction model and derive the average system cost function, which measures the cost of serving aging content in addition to the regular cache-miss cost. Next, we solve the cache-unconstrained case, which reveals how the refresh dynamics and popularity of content affect optimal caching. Then, we extend our approach to a soft-cache-constrained version, in which cache use is guaranteed to stay within the limit with arbitrarily high probability. The corresponding solution reveals the interesting insight that whether an item should be cached locally depends primarily on its popularity level and channel reliability, whereas how long a cached item should be held before eviction depends primarily on its refresh rate. Moreover, we investigate the cost vs. cache-saving trade-offs and prove that substantial cache savings can be obtained while also asymptotically achieving the minimum cost as the database size grows.
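To make the time-based-eviction model concrete, the following minimal Python sketch caches an item only if its popularity clears a threshold and holds it for a TTL that shrinks with its refresh rate, mirroring the qualitative insight stated above. The class and parameter names (popularity_threshold, ttl_scale) and the specific admission/TTL rules are illustrative assumptions for exposition, not the paper's derived optimal policy.

```python
import time
from dataclasses import dataclass


@dataclass
class Item:
    key: str
    popularity: float      # request rate for this item (assumed known)
    refresh_rate: float    # rate at which the back-end database updates the item
    payload: object = None


class TTLCache:
    """Minimal time-based-eviction (TTL) cache sketch.

    Admission ("cache or not") is gated by a popularity threshold, while the
    holding time ("how long before eviction") decreases as the item's refresh
    rate grows. Both rules are placeholders, not the paper's optimal scheme.
    """

    def __init__(self, popularity_threshold: float, ttl_scale: float):
        self.popularity_threshold = popularity_threshold
        self.ttl_scale = ttl_scale
        self._store = {}  # key -> (item, expiry timestamp)

    def maybe_admit(self, item: Item) -> None:
        """Cache the item only if it is popular enough; set its holding time."""
        if item.popularity >= self.popularity_threshold:
            ttl = self.ttl_scale / item.refresh_rate  # faster refresh -> shorter hold
            self._store[item.key] = (item, time.monotonic() + ttl)

    def get(self, key: str):
        """Return a still-held copy, or None on a miss or an expired entry."""
        entry = self._store.get(key)
        if entry is None:
            return None
        item, expiry = entry
        if time.monotonic() > expiry:
            del self._store[key]  # time-based eviction
            return None
        return item
```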