Abstract

Mobile edge computing substantially reduces the high latency of cloud computing. However, current research focuses on computation offloading and gives little attention to service caching. To address the service caching problem, especially in highly mobile sensor-network scenarios, we study a mobility-aware service caching mechanism. Our goal is to maximize the number of users served by the local edge-cloud, which requires predicting each user's target location to avoid invalid service requests. First, we propose an idealized geometric model to predict the target area of a user's movement. Because it is difficult to obtain all the data the model needs in practice, we mine local movement-track information using frequent patterns. We then combine the trajectory-mining results with the geometric model to predict the user's target location. Based on the prediction and the existing service cache, a service allocation algorithm forwards each request to the appropriate base station. Finally, to train on and predict the most popular services online, we propose a service cache selection algorithm based on a back-propagation (BP) neural network. Simulation experiments show that our service cache algorithm reduces service response time by about 13.21% on average compared with other algorithms, and increases the local service proportion by about 15.19% on average compared with the algorithm without mobility prediction.
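The allocation step described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the pattern table, cell names, and fallback rule are assumptions. The predicted next cell is looked up from frequent movement patterns mined offline, and the request is forwarded to that cell's edge-cloud only if it already caches the service.

```python
# Hypothetical sketch of mobility-aware service allocation: predict the
# user's next cell from mined frequent movement patterns, then forward the
# request to a base station that already caches the service, falling back
# to the remote cloud. All names and data structures are illustrative.

def predict_next_cell(history, frequent_patterns):
    """Return the most likely next cell for the user's recent trajectory.

    frequent_patterns maps a trajectory prefix (tuple of cells) to a
    dict {next_cell: support} mined from historical movement tracks.
    """
    for k in range(len(history), 0, -1):      # prefer the longest matching prefix
        prefix = tuple(history[-k:])
        if prefix in frequent_patterns:
            nexts = frequent_patterns[prefix]
            return max(nexts, key=nexts.get)  # highest-support successor
    return history[-1]                        # no pattern found: assume user stays


def allocate(service, history, frequent_patterns, cache):
    """Pick the serving node: the predicted cell's edge-cloud if it caches
    the service, otherwise the remote cloud."""
    target = predict_next_cell(history, frequent_patterns)
    if service in cache.get(target, set()):
        return target                         # served locally at the edge
    return "cloud"                            # avoid an invalid local request
```

For example, with `frequent_patterns = {("A", "B"): {"C": 5, "A": 1}}` and `cache = {"C": {"nav"}}`, a user whose recent track is `["A", "B"]` requesting `"nav"` is forwarded to cell `"C"`, while an uncached service goes to the cloud.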

Highlights

  • With the rapid development of large-scale cloud computing, more and more service providers have chosen to deploy content and services on the cloud

  • We propose a service cache selection algorithm based on BP neural network for the edge-cloud server


Summary

Introduction

With the rapid development of large-scale cloud computing, more and more service providers have chosen to deploy content and services on the cloud. Generally speaking, such services are highly delay-sensitive and have complex computing requirements. To address this problem, researchers have proposed the mobile edge computing model, i.e., deploying an edge-cloud server on a base station close to the end user to perform some computing jobs, which can significantly reduce network transmission delay [4]. We propose a service cache selection algorithm based on a BP neural network for the edge-cloud server; it uses historical information and existing service requests to predict the most popular services online.
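The BP-based prediction idea can be illustrated with a minimal sketch. The architecture (one hidden layer), the input features (normalized recent request counts per service), and the training loop below are assumptions for illustration, not the authors' actual model; the point is only to show how back-propagation lets the edge-cloud learn a popularity score online from incoming requests.

```python
# Minimal BP (back-propagation) network sketch for online service-popularity
# prediction. Inputs are normalized recent request counts; the output is a
# popularity score in (0, 1). Architecture and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BPNet:
    def __init__(self, n_in, n_hidden, lr=0.5):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1 + self.b1)   # hidden activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def train(self, x, t):
        y = self.forward(x)
        # Back-propagate the squared error through both layers.
        d2 = (y - t) * y * (1 - y)                # output-layer delta
        d1 = (d2 @ self.W2.T) * self.h * (1 - self.h)  # hidden-layer delta
        self.W2 -= self.lr * np.outer(self.h, d2)
        self.b2 -= self.lr * d2
        self.W1 -= self.lr * np.outer(x, d1)
        self.b1 -= self.lr * d1
        return float(((y - t) ** 2).sum())

# Toy online training: high recent request counts -> popular (label 1.0).
net = BPNet(n_in=3, n_hidden=4)
samples = [([0.9, 0.8, 0.9], 1.0), ([0.1, 0.0, 0.2], 0.0)]
for _ in range(2000):
    for x, t in samples:
        net.train(np.array(x), t)
```

In a cache-selection setting, the edge-cloud would score each candidate service this way and cache the top-scoring ones, retraining incrementally as new requests arrive.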

Related Work
System Model
User Location Prediction and Service Allocation
Idealized Geometric Model
Mining Frequent Patterns
Service Allocation Algorithm
Service Cache Selection Algorithm
BP Neural Network Model
Model Performance Analysis
Experiment and Performance Evaluation
Experimental Environment
Performance Comparison
Findings
Conclusions and Future Work

