Abstract
We propose a novel demand-driven caching framework, called cache-on-demand (CoD). In CoD, intermediate/final answers of existing running queries are viewed as virtual caches that can be materialized if they are beneficial to incoming queries. Such an approach is essentially nonspeculative: the exact cost of investment and the return on investment are known, and the cache is certain to be reused! We address several issues that must be resolved for CoD to be realized. We also propose three optimizing strategies: Conform-CoD, Scramble-CoD, and Integrated-CoD. Conform-CoD and Scramble-CoD are based on a two-phase optimization framework, while Integrated-CoD operates in a single-phase framework. We conducted an extensive performance study to evaluate the effectiveness of these algorithms. Our results show that all the CoD-based schemes can provide substantial performance improvement when compared with a predictive scheme and a no-caching scheme.
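To illustrate the nonspeculative flavor of the approach, the sketch below shows a minimal cache-selection decision under assumed cost figures. The names (VirtualCache, materialization_cost, saved_cost, select_caches) are illustrative placeholders, not identifiers from the paper; the point is only that both the investment and the return are known exactly from the queries already at hand.

```python
# A minimal sketch of the cache-on-demand (CoD) decision, under assumptions:
# VirtualCache, materialization_cost, saved_cost, and select_caches are
# hypothetical names introduced here for illustration only.
from dataclasses import dataclass


@dataclass
class VirtualCache:
    """An intermediate/final result of a running query that could be cached."""
    node_id: str
    materialization_cost: float  # known exactly: cost of writing the result out
    saved_cost: float            # known exactly: work the incoming query avoids


def select_caches(candidates: list[VirtualCache]) -> list[VirtualCache]:
    """Materialize only those virtual caches whose benefit to the incoming
    query exceeds the investment; both figures are known, so the decision
    is nonspeculative."""
    return [c for c in candidates if c.saved_cost > c.materialization_cost]


# Example: only the join result pays off, so only it is materialized.
candidates = [
    VirtualCache("scan(R)", materialization_cost=40.0, saved_cost=25.0),
    VirtualCache("join(R,S)", materialization_cost=30.0, saved_cost=120.0),
]
print([c.node_id for c in select_caches(candidates)])  # ['join(R,S)']
```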