Abstract
Resource usage prediction is an important aspect of achieving optimal resource provisioning in the cloud. The presence of long-range dependence in cloud workloads makes conventional time-series models unsuitable for resource usage prediction. In this paper, we propose multivariate long short-term memory (LSTM) models for predicting resource usage in cloud workloads. We analyze and compare the predictions of an LSTM model and a bidirectional LSTM model with those of fractional-difference-based methods. The proposed LSTM models are evaluated against state-of-the-art methods on the Google cluster trace [1]. The experimental results show that the proposed algorithms outperform the state-of-the-art algorithms.
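The fractional-difference baselines referenced above handle long-range dependence by differencing a series with a non-integer order d, using the binomial-expansion weights of (1-B)^d. The sketch below (plain Python; the function names and the choice d = 0.4 are illustrative assumptions, not taken from the paper) shows how those weights are generated and applied:

```python
def frac_diff_weights(d, num_weights):
    """Weights of the binomial expansion of (1-B)^d:
    w_0 = 1, w_k = -w_{k-1} * (d - k + 1) / k."""
    weights = [1.0]
    for k in range(1, num_weights):
        weights.append(-weights[-1] * (d - k + 1) / k)
    return weights

def frac_diff(series, d, window):
    """Fractionally difference `series` with a truncated weight window.
    Returns values only where a full window of history is available."""
    w = frac_diff_weights(d, window)
    return [
        sum(w[k] * series[t - k] for k in range(window))
        for t in range(window - 1, len(series))
    ]

# Illustrative usage with a hypothetical differencing order d = 0.4:
w = frac_diff_weights(0.4, 4)   # [1.0, -0.4, -0.12, -0.064]
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = frac_diff(x, 0.4, 3)
```

With 0 < d < 1, the weights decay slowly, so distant past values retain a small influence, which is what lets these methods model long-range dependence while still (approximately) stationarizing the series.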