Abstract

In this paper, we present a predictive prefetching mechanism based on a probability graph approach that prefetches data between the levels of a parallel hybrid storage system. The fundamental idea is to exploit the parallelism of the hybrid storage system and prefetch data among its multiple storage levels (e.g., solid state disks and hard disk drives) in parallel with the application's on-demand I/O read requests. In this study, we show that predictive prefetching across multiple storage levels is an efficient technique for placing data blocks that will be needed in the near future in the uppermost level, closest to the application. Our PPHSS approach extends previous ideas of predictive prefetching in two ways: (1) it reduces applications' execution elapsed time by keeping data blocks that are predicted to be accessed in the near future cached in the uppermost level; and (2) it provides a parallel data fetching scheme in which multiple fetching mechanisms (i.e., predictive prefetching and the application's on-demand data requests) work in parallel, where the first fetches data blocks among the different levels of the hybrid storage system (i.e., from low-level (slow) to high-level (fast) storage devices) and the second fetches data from the storage system to the application. Our PPHSS strategy, integrated with the predictive prefetching mechanism, significantly reduces the overall I/O access time in a hybrid storage system. Finally, we developed a simulator to evaluate the performance of the proposed predictive prefetching scheme in the context of hybrid storage systems. Our results show that PPHSS can improve system performance by 4% on real-world I/O traces without the need for large caches.
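To illustrate the parallel fetching scheme described above, the following sketch shows one way two fetching mechanisms can run concurrently: one worker serves the application's on-demand reads, while a second worker promotes predicted blocks from the slow level to the fast level. This is not the paper's implementation; the names (on_demand_reader, predictive_prefetcher, SSD, HDD) and the use of Python threads are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): two concurrent workers,
# one serving the application's on-demand reads and one predictively
# promoting blocks from the slow level (HDD) to the fast level (SSD).
import queue
import threading

HDD = {b: f"data-{b}" for b in range(100)}   # hypothetical slow level
SSD = {}                                      # hypothetical fast level (cache)

demand_q = queue.Queue()    # blocks the application asks for
prefetch_q = queue.Queue()  # blocks the predictor expects to be asked for next

def on_demand_reader(results):
    """Serve application reads; hit the SSD first, fall back to the HDD."""
    while True:
        block = demand_q.get()
        if block is None:
            break
        results.append(SSD.get(block, HDD[block]))

def predictive_prefetcher():
    """Promote predicted blocks from the HDD to the SSD ahead of demand."""
    while True:
        block = prefetch_q.get()
        if block is None:
            break
        SSD.setdefault(block, HDD[block])

results = []
workers = [threading.Thread(target=on_demand_reader, args=(results,)),
           threading.Thread(target=predictive_prefetcher)]
for w in workers:
    w.start()

prefetch_q.put(6)   # the predictor guesses block 6 will be needed soon
demand_q.put(5)     # the application demands block 5 concurrently
demand_q.put(6)     # ...then block 6, which is ideally already on the SSD

demand_q.put(None)
prefetch_q.put(None)
for w in workers:
    w.join()
```

If the prefetcher wins the race, the read of block 6 is served from the fast level; otherwise the on-demand path simply falls back to the slow level, so prefetching never blocks the application's own requests.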

Highlights

  • In data-intensive computing systems, researchers have proposed a wide variety of prefetching techniques to preload data from disks prior to data accesses in order to solve the I/O bottleneck problem

  • The following list summarizes the major research contributions made in this paper: to reduce I/O delays and the application's execution elapsed time in a hybrid storage system, we propose a new predictive prefetching approach that works in parallel with the application's on-demand I/O reading requests

  • The system consists of an application that continuously issues on-demand I/O reading requests for data blocks that are stored in a parallel hybrid storage system

Summary

Introduction

In data-intensive computing systems, researchers have proposed a wide variety of prefetching techniques to preload data from disks prior to data accesses in order to solve the I/O bottleneck problem (see [1] [2]). Existing prefetching techniques can be categorized into two camps: predictive and informed. Predictive prefetching approaches predict future I/O access patterns based on the historical I/O accesses of applications [3], whereas informed prefetching techniques make preloading decisions based on applications' future access hints [4]. We focus on predictive prefetching schemes and investigate the performance impact of predictive prefetching on hybrid storage systems. We predictively prefetch data from the lower (slow) levels to the uppermost (fast) level of a hybrid storage system to improve the application's performance and reduce its execution elapsed time.
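As a rough illustration of how a probability-graph predictor of this kind can operate, the sketch below counts how often each block follows another in the historical access stream and prefetches the most likely successor of the most recent access. This is a minimal first-order example under our own assumptions, not the paper's algorithm; the ProbabilityGraph class and its methods are hypothetical.

```python
# Illustrative sketch only: a simple first-order probability graph where
# edge weights count how often block j was accessed right after block i,
# and the predictor proposes the most likely successor of the last access.
from collections import defaultdict

class ProbabilityGraph:
    def __init__(self):
        self.edges = defaultdict(lambda: defaultdict(int))  # i -> {j: count}
        self.last = None

    def record(self, block):
        """Update the graph with one observed access from the I/O trace."""
        if self.last is not None:
            self.edges[self.last][block] += 1
        self.last = block

    def predict(self, block):
        """Return the most frequent successor of `block`, or None if unseen."""
        successors = self.edges.get(block)
        if not successors:
            return None
        return max(successors, key=successors.get)

graph = ProbabilityGraph()
for b in [1, 2, 3, 1, 2, 4, 1, 2, 3]:   # hypothetical historical accesses
    graph.record(b)

print(graph.predict(2))  # -> 3, the candidate block to prefetch after block 2
```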

Motivations
Contributions
Roadmap
Literature Review
Hybrid Storage Systems
Predictive Prefetching
Informed Prefetching
Prefetching in Hybrid Storage Systems
System Design
The Design of Hardware and Software Architectures
System Assumptions
Data Initial Placement
A Prefetching for Data Transfers
Block Size
The Use of LASR Traces
The PPHSS Algorithm
Definitions
The Probability Graph Predictive Prefetching Approach
PPHSS Implementation
System Parameters’ Validations
The PPHSS Simulation
Findings
Conclusion & Future Work
