Efficient prediction of cloud resource demand plays a crucial role in resource allocation and scheduling in cloud data centers, helping to optimize resource utilization and improve service quality. However, accurately predicting cloud resource demand is challenging: prediction models often fail under real-world conditions such as extreme load peaks, and computational burden limits their global characterization capability. To effectively handle univariate cloud resource load time series with multidimensional hidden factors, we propose a sample entropy-optimized variational mode decomposition Transformer (VMDSE-Tformer) for cloud resource scheduling. Specifically, we decompose the time series via variational mode decomposition and then reconstruct the collection of subsequences using sample entropy. A Transformer-style framework with a multi-head self-attention mechanism then learns deep features and produces an encoded representation of each component sequence. We conduct extensive experiments on three benchmark datasets, comparing against five state-of-the-art models. Notably, the MAPE of VMDSE-Tformer improves by about 60% over LSTNet. The results demonstrate the superior performance of VMDSE-Tformer in predicting task sequence intensity, CPU, and RAM resource demand. VMDSE-Tformer can therefore serve as a powerful and efficient tool for predicting resource demand in cloud data centers, with implications for more effective resource management and service delivery.
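The sample-entropy step used to group decomposed subsequences can be sketched as follows. This is a minimal illustration of the standard SampEn(m, r) definition (lower values indicate more regular series), not the paper's implementation; the defaults m=2 and r=0.2·σ are common conventions, and the template-counting variant shown here is one of several equivalent formulations.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series x.

    r is interpreted as a fraction of the series' standard deviation,
    a common convention (an assumption here, not taken from the paper).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    tol = r * np.std(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every template,
            # then subtract 1 to exclude the self-match.
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= tol) - 1
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A regular (periodic) component yields lower entropy than a noisy one,
# which is what makes SampEn useful for ranking decomposed subsequences.
t = np.linspace(0, 20 * np.pi, 400)
regular = np.sin(t)
noisy = np.random.default_rng(0).standard_normal(400)
```

In the proposed pipeline, a score like this would be computed for each mode produced by variational mode decomposition, and modes with similar complexity could then be merged before being fed to the Transformer encoder.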