Abstract

The large number of resource access requests from heterogeneous terminals brings severe challenges to ensuring the performance and efficiency of transparent computing servers. Caching mechanisms play a significant role in improving the performance of transparent computing systems. Nevertheless, existing caching mechanisms do not take into account the complex and volatile runtime context on the server side, such as changes in users' access requirements for resources and in server performance status, so their cache scheduling strategies lack flexibility and diversity. Thus, in this paper, we propose a software-defined cache scheduling framework that can dynamically and flexibly schedule appropriate caching policies according to monitored information to achieve optimal caching performance for the transparent computing server. First, we construct a multi-layer, linked virtual disk storage model and its resource access mechanism. Then, based on this storage model, in order to perceive changes in users' demand for server resources, we adopt information entropy to model and analyze user access behavior and predict it with an exponential smoothing algorithm. Finally, cache scheduling is formulated as an optimization problem covering two aspects, prefetching and replacement, and heuristic algorithms are used to obtain approximately optimal solutions based on the analysis and prediction of user access behavior. We conducted experiments on real data to test the effectiveness of our approach, and the results show that it achieves better caching performance than traditional methods, thereby effectively improving the service quality and user experience of transparent computing.
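The two analysis steps named above can be illustrated with a minimal sketch: Shannon entropy over a user's per-resource access frequencies measures how dispersed (and thus how predictable) the demand is, and single exponential smoothing produces a one-step-ahead forecast of such a metric. The function names, the choice of single (rather than double or triple) smoothing, and the smoothing factor are illustrative assumptions, not the paper's actual implementation.

```python
import math

def access_entropy(access_counts):
    """Shannon entropy (bits) of an access distribution over resources.

    access_counts: per-resource access frequencies for one observation window.
    Higher entropy means accesses are spread more evenly, i.e. demand is
    more dispersed and harder to predict.
    """
    total = sum(access_counts)
    probs = [c / total for c in access_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def exponential_smoothing(series, alpha=0.5):
    """Single exponential smoothing; returns the one-step-ahead forecast.

    alpha in (0, 1] weights recent observations; the value 0.5 here is an
    arbitrary illustrative choice.
    """
    forecast = series[0]
    for x in series[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast
```

For example, a window in which four resources are accessed equally often yields the maximum entropy of 2 bits, while a window dominated by one resource yields entropy near 0; smoothing a series of such per-window entropies gives the predicted value used to drive policy selection.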
