Abstract

Technological advancements have led to exponential growth in input/output-intensive data that demands high-performance computing. In such workloads, the same queries are issued repeatedly, so predicting and prefetching these frequently used queries can improve performance in terms of execution time and cache hit ratio. This paper proposes a prediction-based framework that first generates memory traces to identify data usage patterns in terms of query frequency. Future query requests are then predicted and classified using an ensemble approach that achieves 87.5% accuracy and reduces the error rate by up to 11%. The classified predictions are tagged as hot or cold data on the basis of a threshold frequency, and the identified hot data is prefetched into the cache, yielding 96.5% cache hits and a 9.7% reduction in execution time. A hybrid cache replacement algorithm keeps the cache populated with hot data. Compared with existing frameworks and benchmarks, the experimental results show a 6.8% improvement in accuracy and a 9% increase in cache hits.
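
The hot/cold tagging and prefetching steps described above can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: the threshold frequency (HOT_THRESHOLD), the cache capacity, and the use of a plain LRU eviction policy in place of the paper's hybrid replacement algorithm are assumptions chosen only for illustration.

```python
from collections import Counter, OrderedDict

# Hypothetical illustration (not the paper's implementation): tag queries as
# hot or cold by comparing their observed frequency in a memory trace against
# a threshold, then prefetch the hot queries into a bounded LRU cache.

HOT_THRESHOLD = 5      # assumed threshold frequency
CACHE_CAPACITY = 3     # assumed cache size

def tag_queries(trace):
    """Count query occurrences and split them into hot and cold sets."""
    freq = Counter(trace)
    hot = {q for q, n in freq.items() if n >= HOT_THRESHOLD}
    cold = set(freq) - hot
    return hot, cold

def prefetch(hot, cache, capacity=CACHE_CAPACITY):
    """Load hot queries into the cache, evicting the least recently used entry."""
    for q in hot:
        if q in cache:
            cache.move_to_end(q)          # refresh recency
        elif len(cache) >= capacity:
            cache.popitem(last=False)     # evict least recently used entry
        cache[q] = True
    return cache

if __name__ == "__main__":
    trace = ["q1"] * 7 + ["q2"] * 2 + ["q3"] * 6 + ["q4"]
    hot, cold = tag_queries(trace)
    cache = prefetch(hot, OrderedDict())
    print("hot:", hot, "cold:", cold, "cache:", list(cache))
```

In this sketch, only queries whose trace frequency meets the assumed threshold are prefetched; in the framework described above, the prediction model and the hybrid replacement policy would decide which entries enter and leave the cache.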
