Abstract

With the rapid development of 5G technology, the need for flexible, scalable real-time data-processing systems has become increasingly important. By predicting future resource workloads, cloud service providers can automatically provision and deprovision user resources in advance to meet service level agreements. However, workload demands fluctuate continuously over time, which makes them difficult to predict. Hence, several studies have applied time series forecasting techniques to accurately predict resource workloads. Most of these studies, however, focused solely on univariate time series forecasting; in other words, they analyzed the measurements of only a single feature. This study proposes an efficient multivariate autoscaling framework using bidirectional long short-term memory (Bi-LSTM) for cloud computing. The system framework was designed around the monitor–analyze–plan–execute loop. The results obtained from our experiments on several real workload datasets indicated that the proposed multivariate Bi-LSTM achieved a root-mean-squared error (RMSE) 1.84 times smaller than that of the univariate model. Furthermore, it reduced the RMSE by 6.7% and 5.4% compared with the multivariate LSTM and convolutional neural network–long short-term memory (CNN-LSTM) models, respectively. Finally, in terms of resource provisioning, the multivariate Bi-LSTM autoscaler was 47.2% and 14.7% more efficient than the multivariate LSTM and CNN-LSTM autoscalers, respectively.
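The monitor–analyze–plan–execute (MAPE) loop mentioned above can be illustrated with a minimal sketch. Note that everything here is a hypothetical stand-in, not the paper's implementation: the forecaster is a simple moving average in place of the multivariate Bi-LSTM, and `CAPACITY_PER_INSTANCE` and the headroom factor are assumed values chosen only for illustration.

```python
# Hypothetical sketch of a MAPE-style autoscaling loop. The analyze step
# here uses a moving average as a stand-in for the paper's multivariate
# Bi-LSTM forecaster; capacity and headroom constants are assumptions.
import math
from collections import deque

CAPACITY_PER_INSTANCE = 100.0  # requests/s one instance serves (assumed)


def analyze(history, window=3):
    """Predict the next workload value (stand-in for the Bi-LSTM model)."""
    recent = list(history)[-window:]
    return sum(recent) / len(recent)


def plan(predicted_load, headroom=1.2):
    """Decide how many instances to provision, with safety headroom."""
    return max(1, math.ceil(predicted_load * headroom / CAPACITY_PER_INSTANCE))


def mape_step(history, new_observation):
    """One monitor-analyze-plan iteration; execute would apply the result."""
    history.append(new_observation)   # monitor: record the latest metric
    predicted = analyze(history)      # analyze: forecast the next workload
    instances = plan(predicted)       # plan: map forecast to a scaling action
    return predicted, instances


history = deque(maxlen=60)  # sliding window of recent workload observations
for load in [80.0, 120.0, 160.0]:
    predicted, instances = mape_step(history, load)
# After the last step, the 3-point average is 120.0 req/s -> 2 instances.
```

In a real deployment the execute step would call the cloud provider's scaling API with the planned instance count; proactive provisioning ahead of the predicted demand is what lets the autoscaler meet service level agreements despite fluctuating workloads.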
