Abstract
Cloud computing allocates resources on demand in order to use them efficiently while maintaining service-level agreements. Because reactive strategies delay resource allocation, proactive approaches that rely on predictions are necessary. However, the load on cloud hosts varies far more than in grid computing, so accurate prediction remains a challenge. In this paper, we propose a prediction method based on a Long Short-Term Memory Encoder-Decoder (LSTM-ED) that predicts both the mean load over consecutive intervals and the actual load multiple steps ahead. Our LSTM-ED approach improves on the memory capability of the plain LSTM used in recent work by building an internal representation of the time-series data. To evaluate the approach, we conducted experiments on a one-month trace of a Google data centre with more than twelve thousand machines. The results show that, whereas a multi-layer LSTM overfits and loses accuracy compared to the single-layer LSTM used in previous work, our LSTM-ED approach achieves higher accuracy than all previous models, including the recent LSTM-based one.
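The abstract does not specify the model's implementation details, but the encoder-decoder idea it describes can be illustrated with a minimal sketch: an encoder LSTM compresses the observed load history into an internal state, and a decoder LSTM unrolls that state autoregressively to produce a multi-step forecast. The sketch below is written in PyTorch purely for illustration; the layer sizes, horizon, and class names are hypothetical assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class LSTMEncoderDecoder(nn.Module):
    """Illustrative LSTM encoder-decoder for multi-step load forecasting.

    The encoder summarizes the input history into a fixed-size state (the
    'internal representation'); the decoder unrolls that state to emit one
    prediction per future interval. Hyperparameters are assumptions.
    """
    def __init__(self, input_size=1, hidden_size=64, horizon=8):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.proj = nn.Linear(hidden_size, input_size)

    def forward(self, history):
        # history: (batch, T, 1) normalized host-load measurements
        _, state = self.encoder(history)      # encode history into (h, c)
        step = history[:, -1:, :]             # seed decoder with last observation
        outputs = []
        for _ in range(self.horizon):         # autoregressive decoding
            out, state = self.decoder(step, state)
            step = self.proj(out)             # one-step-ahead prediction
            outputs.append(step)
        return torch.cat(outputs, dim=1)      # (batch, horizon, 1)

# Usage: forecast the next 8 intervals from 32 observed ones (dummy data).
model = LSTMEncoderDecoder()
history = torch.rand(16, 32, 1)
forecast = model(history)                     # shape: (16, 8, 1)
```

The same structure can serve both prediction targets named in the abstract: the decoder outputs can be read as per-interval actual-load forecasts, or averaged over consecutive intervals to yield mean-load predictions.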