Abstract

Garbage collection (GC) is one of the main causes of long-tail latency in storage systems. At the 99th percentile, the latency caused by GC is more than 100 times greater than the average latency. Because of this long tail, real-time and quality-critical systems cannot meet their requirements. In this study, we propose a novel key state management technique for reinforcement learning-assisted garbage collection. The goal is to dynamically manage key states selected from a very large number of state candidates. Dynamic management lets us exploit suitable, frequently recurring key states at a small area cost, since the full state space does not have to be maintained. Experimental results with real-world workloads show that the proposed technique reduces long-tail latency by 22--25% compared to a state-of-the-art scheme.
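To make the idea of dynamic key state management concrete, the following is a minimal, illustrative sketch only: the abstract does not specify the paper's state features, actions, reward, or replacement policy, so the state tuple, the three GC actions, the LRU-style eviction, and the reward shaping below are all assumptions for illustration, not the authors' actual design.

```python
# Illustrative sketch of RL-assisted GC with a bounded "key state" table.
# Hypothetical elements: the (free-block bucket, queue-depth bucket) state,
# the three GC actions, LRU eviction, and the toy reward.
import random
from collections import OrderedDict


class KeyStateTable:
    """Fixed-capacity Q-table that keeps only frequently recurring 'key' states.

    States are admitted on access; when the table is full, the least recently
    used entry is evicted, so rarely seen states never consume table space.
    """

    def __init__(self, capacity, num_actions):
        self.capacity = capacity
        self.num_actions = num_actions
        self.q = OrderedDict()  # state -> list of Q-values, one per action

    def lookup(self, state):
        if state in self.q:
            self.q.move_to_end(state)   # mark as recently used
            return self.q[state]
        if len(self.q) >= self.capacity:
            self.q.popitem(last=False)  # evict the least recently used state
        self.q[state] = [0.0] * self.num_actions
        return self.q[state]


def choose_gc_action(table, state, epsilon=0.1):
    """Epsilon-greedy choice among hypothetical GC actions
    (0 = delay GC, 1 = reclaim one block, 2 = reclaim aggressively)."""
    qvals = table.lookup(state)
    if random.random() < epsilon:
        return random.randrange(len(qvals))
    return max(range(len(qvals)), key=lambda a: qvals[a])


def update(table, state, action, reward, next_state, alpha=0.2, gamma=0.9):
    """Standard one-step Q-learning update restricted to the key-state table."""
    qvals = table.lookup(state)
    next_q = max(table.lookup(next_state))
    qvals[action] += alpha * (reward + gamma * next_q - qvals[action])


if __name__ == "__main__":
    table = KeyStateTable(capacity=64, num_actions=3)
    # Toy trace: state = (free-block bucket, I/O-queue-depth bucket).
    state = (3, 1)
    for _ in range(1000):
        action = choose_gc_action(table, state)
        # Toy reward: penalize aggressive GC while the host queue is busy.
        reward = -1.0 if (action == 2 and state[1] > 2) else 0.1
        next_state = (random.randrange(8), random.randrange(4))
        update(table, state, action, reward, next_state)
        state = next_state
    print(f"key states tracked: {len(table.q)} / capacity 64")
```

The point of the sketch is the bounded table: only states that recur often enough to stay resident receive learned Q-values, which is how a small fixed area can stand in for an otherwise very large candidate state space.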
