Abstract

With the rapid development of smart devices, abundant data are generated at the edge of the network. Edge computing offers low response delay, high network throughput, and low backhaul-link pressure, making it well suited to processing this data. Nevertheless, cloud-service latency faces serious challenges as the requirements for timely content delivery and real-time user interaction grow. To hide the delay of user requests, a cache prefetching strategy based on the UCBM algorithm is proposed: a Markov chain classifies user behaviors, a Bayesian network computes the probability that a given user accesses each file, and the user's next access is then predicted. This model markedly improves prefetch accuracy. In this paper, a cache replacement policy based on the FHPA algorithm is also proposed, which makes full use of the limited storage space on edge devices: taking file heat into account, it estimates the probability that each cached file will be re-accessed, and the cached file with the smallest re-access probability is evicted. An edge-computing environment is built in a campus network to evaluate the performance of our algorithms, which significantly outperform the benchmark algorithms.
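The replacement idea described above can be sketched in a few lines. The following is a minimal illustrative cache that evicts the entry with the lowest estimated re-access probability, approximated here by file heat (access count) decayed by time since last access; the class name, scoring formula, and API are assumptions for illustration, not the paper's actual FHPA algorithm.

```python
import time

class HeatAwareCache:
    """Toy cache: evicts the file whose estimated re-access
    probability (heat decayed by age) is smallest. Illustrative
    sketch only, not the paper's FHPA algorithm."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}   # file_id -> content
        self.hits = {}    # file_id -> access count (file heat)
        self.last = {}    # file_id -> last access time

    def _reaccess_score(self, fid, now):
        # Higher heat and more recent access -> more likely re-accessed.
        age = max(now - self.last[fid], 1e-9)
        return self.hits[fid] / age

    def put(self, fid, content, now=None):
        now = time.monotonic() if now is None else now
        if fid not in self.store and len(self.store) >= self.capacity:
            # Evict the file with the smallest re-access score.
            victim = min(self.store, key=lambda f: self._reaccess_score(f, now))
            for d in (self.store, self.hits, self.last):
                del d[victim]
        self.store[fid] = content
        self.hits.setdefault(fid, 0)
        self.last[fid] = now

    def get(self, fid, now=None):
        now = time.monotonic() if now is None else now
        if fid in self.store:
            self.hits[fid] += 1
            self.last[fid] = now
            return self.store[fid]
        return None
```

For example, with capacity 2, a frequently accessed file survives while a never re-accessed one is evicted when a third file arrives.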
