Abstract

The Internet of Everything continues to expand, and the volume of data that must be processed in the network keeps growing. Edge cloud technology processes data at the edge of the network, reducing the burden on the data center. When the load on the edge cloud is high, additional resources must be requested from the cloud service provider, and the billing granularity of those resources affects the cost; when the load is low, releasing idle node resources back to the provider reduces service expenditure. To address this, an on-demand resource provisioning model based on service expenditure is proposed. Because resource demand must be estimated in advance, a load estimation model combining an ARIMA model with a BP neural network is also proposed; it estimates the load from historical data and reduces the estimation error. Before a node's resources are released, the user data on that node must be migrated to other working nodes so that no user data is lost. When selecting migration targets, this paper considers three metrics: cluster load balancing, migration time, and migration cost, and proposes a data migration model based on load balancing. Experimental comparisons show that the proposed methods effectively reduce service expenditure and keep the cluster load balanced.
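
The abstract describes the ARIMA + BP-neural-network load estimator only at a high level. The sketch below is a minimal illustration of one common way such a hybrid is built, assuming a residual-correction scheme (ARIMA captures the linear trend, and a small multilayer perceptron standing in for the BP network predicts a correction from the ARIMA residuals). The function name estimate_next_load, the ARIMA order, the window size, and the layer sizes are illustrative assumptions, not the paper's actual parameters.

```python
# Illustrative sketch only: the paper's exact hybrid formulation is assumed here
# (ARIMA models the linear component; a BP/MLP network trained on the ARIMA
# residuals corrects the forecast). All hyperparameters are placeholders.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

def estimate_next_load(load_history, window=5):
    """Estimate the next load value from a 1-D array of historical samples."""
    load_history = np.asarray(load_history, dtype=float)

    # 1) Linear component: fit ARIMA and forecast one step ahead.
    arima = ARIMA(load_history, order=(2, 1, 2)).fit()
    linear_pred = float(arima.forecast(steps=1)[0])

    # 2) Nonlinear component: train a BP (multilayer perceptron) network
    #    on sliding windows of the ARIMA residuals.
    residuals = np.asarray(arima.resid, dtype=float)
    X = np.array([residuals[i:i + window] for i in range(len(residuals) - window)])
    y = residuals[window:]
    mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    mlp.fit(X, y)

    # 3) Combine: corrected estimate = ARIMA forecast + predicted residual.
    residual_pred = float(mlp.predict(residuals[-window:].reshape(1, -1))[0])
    return linear_pred + residual_pred

# Example with synthetic periodic load data.
rng = np.random.default_rng(0)
history = 100 + 10 * np.sin(np.arange(60) / 5.0) + rng.normal(0, 2, 60)
print(estimate_next_load(history))
```

The residual-correction design reflects the usual motivation for pairing ARIMA with a neural network: the ARIMA term handles the regular, linear part of the load series, while the network absorbs the nonlinear fluctuations that drive the estimation error the abstract aims to reduce.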
