Abstract

Time series forecasting is an important branch of big data analysis in which future values are estimated from past or present observations of one or more variables. Resource estimation for machines in data centers is a growing research area in time series analysis, and resource utilization estimation is an important consideration in achieving optimal resource provisioning in cloud computing. Because cloud workloads exhibit long-range dependency, traditional methods are insufficient for building predictive models. In this study, RNN, LSTM, BiLSTM, and BiLSTM-RNN models for estimating resource usage in cloud workloads were analyzed with two data decomposition methods, data-based and cluster-based, and the results were compared. In the tests, the most successful result was obtained with the RNN model using the cluster-based separation method.
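The supervised setup described above (estimating future values from past observations) is typically built by slicing the workload series into fixed-length input windows and next-step targets before feeding them to an RNN-family model. A minimal sketch of that windowing step, using a hypothetical `make_windows` helper and illustrative CPU-utilization values (not data or code from the paper):

```python
import numpy as np

def make_windows(series, lookback, horizon=1):
    """Turn a 1-D series into (input window, target) pairs for
    supervised forecasting: X[i] holds `lookback` past values,
    y[i] the value `horizon` steps after the end of that window."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback + horizon - 1])
    return np.asarray(X), np.asarray(y)

# Illustrative CPU-utilization trace (fractions of capacity).
cpu = np.array([0.2, 0.25, 0.3, 0.4, 0.35, 0.5, 0.6, 0.55])
X, y = make_windows(cpu, lookback=3)
# Each row of X (shape: samples x lookback) is one model input;
# the matching entry of y is the value to predict one step ahead.
```

Pairs like `(X, y)` are then what an RNN, LSTM, or BiLSTM is trained on; in a cluster-based decomposition, the same windowing would be applied separately to each cluster of workloads.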
