Abstract

A graph engine should be adaptable so that it can process a wide variety of graph data and algorithms efficiently. For out-of-core graph engines, which exploit a hierarchical memory structure, an adaptive caching scheme is essential to keep memory usage effective. A caching policy selectively stores data that is likely to be reused in the upper memory layer, based on its expectation of the future workload. However, the memory access pattern of a graph workload varies with the graph data, the algorithm, and the configuration, which makes it difficult for a static caching policy to respond to workload changes. In this paper, we propose a graph-adaptive caching scheme that remains consistently effective under changing workloads. Our scheme employs an adaptive policy that responds to workload changes in real time. To detect these changes, we adopt a competition procedure between two contrasting properties that appear in graph workloads: locality and regularity. In addition, we combine two window adjustment techniques to reduce the overhead of the competition procedure. The proposed caching scheme is applicable to different types of graph engines and achieves better memory efficiency. Our experimental results show that our scheme improves graph processing performance by up to 65% compared to existing schemes.
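To make the idea of a policy competition concrete, the following is a minimal sketch, not the paper's actual scheme: it assumes a hypothetical `CompetingAdaptiveCache` in which a locality-oriented policy (LRU) and a regularity-oriented policy (FIFO) are simulated side by side over a fixed-length access window, and the cache adopts whichever policy scored more hits in the last window. All class, parameter, and policy names here are illustrative assumptions, and the window is fixed rather than adjusted as in the proposed scheme.

```python
# Hypothetical sketch of policy competition for an adaptive cache.
# Two candidate eviction policies are simulated on the same access stream;
# after each window, the cache trusts whichever policy hit more often.
from collections import OrderedDict, deque


class CompetingAdaptiveCache:
    def __init__(self, capacity, window=1024):
        self.capacity = capacity            # cacheable blocks (assumed unit size)
        self.window = window                # accesses per competition round (assumed fixed)
        self.lru = OrderedDict()            # locality-oriented candidate: evict least recently used
        self.fifo = deque()                 # regularity-oriented candidate: evict in arrival order
        self.fifo_set = set()
        self.hits = {"lru": 0, "fifo": 0}
        self.accesses = 0
        self.active = "lru"                 # policy currently governing the real cache

    def access(self, block_id):
        """Record one block access; return True if the active policy hits."""
        self.accesses += 1

        # Simulate the LRU candidate.
        lru_hit = block_id in self.lru
        if lru_hit:
            self.lru.move_to_end(block_id)
            self.hits["lru"] += 1
        else:
            if len(self.lru) >= self.capacity:
                self.lru.popitem(last=False)
            self.lru[block_id] = True

        # Simulate the FIFO candidate.
        fifo_hit = block_id in self.fifo_set
        if fifo_hit:
            self.hits["fifo"] += 1
        else:
            if len(self.fifo) >= self.capacity:
                self.fifo_set.discard(self.fifo.popleft())
            self.fifo.append(block_id)
            self.fifo_set.add(block_id)

        # End of a competition window: switch to the winning policy.
        if self.accesses % self.window == 0:
            self.active = max(self.hits, key=self.hits.get)
            self.hits = {"lru": 0, "fifo": 0}

        return lru_hit if self.active == "lru" else fifo_hit
```

In this sketch the competition itself costs extra bookkeeping on every access, which is why the paper pairs the competition procedure with window adjustment techniques to keep that overhead in check.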
