Caching plays a crucial role in modern information systems, improving response time, alleviating server pressure, and supporting load balancing. Data prefetching and cache replacement algorithms are key factors influencing caching performance. Because they cannot accurately learn user interests and preferences, existing rule-based and data-mining caching algorithms fail to capture the distinctive features of user access-behavior sequences, resulting in low cache hit rates. In this article, we introduce BERT4Cache, an end-to-end bidirectional Transformer model with attention for data prefetching in caches. By predicting the objects a user is likely to request in the near future and prefetching them into the cache, BERT4Cache raises hit rates and ultimately improves overall cache performance. Thorough experiments show that BERT4Cache achieves superior hit rates and other metrics compared to generic reactive and advanced proactive caching strategies.
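As a rough illustration of the prefetching idea described above (not the paper's actual architecture), the sketch below scores candidate next objects with a small bidirectional Transformer encoder over a masked access sequence and prefetches the top-k predictions into a cache set. The class name TinyPrefetchModel, the model dimensions, and the untrained weights are illustrative assumptions, not details taken from BERT4Cache.

    import torch
    import torch.nn as nn

    class TinyPrefetchModel(nn.Module):
        """Hypothetical stand-in for a BERT4Cache-style bidirectional encoder."""
        def __init__(self, num_objects, d_model=32, nhead=4, num_layers=2, max_len=20):
            super().__init__()
            self.mask_id = num_objects            # reserve an extra id for the [MASK] token
            self.item_emb = nn.Embedding(num_objects + 1, d_model)
            self.pos_emb = nn.Embedding(max_len, d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=64,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers)  # bidirectional: no causal mask
            self.out = nn.Linear(d_model, num_objects)

        def forward(self, seq):
            # seq: (batch, seq_len) of object ids, with the final position set to [MASK]
            pos = torch.arange(seq.size(1), device=seq.device).unsqueeze(0)
            h = self.encoder(self.item_emb(seq) + self.pos_emb(pos))
            return self.out(h[:, -1])             # scores over objects for the masked slot

    # Usage sketch: predict likely next requests from a recent access history
    # and prefetch the top-k scored objects into the cache (here, a plain set).
    num_objects, top_k = 100, 3
    model = TinyPrefetchModel(num_objects)        # untrained; real use would fit it on access logs
    history = torch.tensor([[12, 7, 55, 7, 31]])  # recent object ids for one user
    masked = torch.cat([history, torch.tensor([[model.mask_id]])], dim=1)
    with torch.no_grad():
        scores = model(masked)
    prefetch = torch.topk(scores, top_k, dim=-1).indices[0].tolist()
    cache = set(prefetch)                         # prefetched objects would now count as hits
    print("prefetched object ids:", prefetch)

In a real deployment, the encoder would be trained on logged access sequences with masked-item prediction so that the top-k scores reflect learned user behavior rather than random initialization.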