Abstract

When a cloud platform runs under heavy load for a long time, internal resources are gradually consumed and errors accumulate continuously. The resulting software aging phenomenon ultimately degrades the performance and reliability of the software system. To address this problem, this paper proposes a hybrid model (VMD-ARIMA-BiLSTM) that integrates variational mode decomposition, the autoregressive integrated moving average model, and the bidirectional long short-term memory network to predict software aging. First, the original resource utilization series is decomposed by variational mode decomposition into stationary and non-stationary time series. Then, the stationary series are predicted with ARIMA and the non-stationary series with BiLSTM, exploiting the respective strengths of the two models. Finally, the individual predictions are reconstructed to obtain the final forecast. Experimental results show that, compared with a single ARIMA or BiLSTM model, the proposed hybrid model achieves higher prediction accuracy and faster convergence.
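To make the pipeline concrete, the following is a minimal Python sketch of the decompose-predict-reconstruct scheme described above. It is not the authors' implementation: the library choices (vmdpy for VMD, statsmodels for ARIMA, Keras for the BiLSTM), the number of modes K, the ARIMA order, the window length, the input file name, and the use of an ADF test to route each mode to ARIMA or BiLSTM are all assumptions made for illustration.

```python
# Minimal sketch of a VMD-ARIMA-BiLSTM pipeline (illustrative only, not the paper's code).
import numpy as np
from vmdpy import VMD
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA
from tensorflow import keras

def make_windows(series, window=10):
    """Turn a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None], np.array(y)

def forecast_mode(mode, horizon, window=10):
    """Forecast one VMD mode: ARIMA if it looks stationary (ADF test), else BiLSTM."""
    train = mode[:-horizon]
    if adfuller(train)[1] < 0.05:                      # stationary -> ARIMA
        fit = ARIMA(train, order=(2, 0, 1)).fit()      # assumed order
        return fit.forecast(steps=horizon)
    # Non-stationary -> bidirectional LSTM with rolling one-step forecasts.
    X, y = make_windows(train, window)
    model = keras.Sequential([
        keras.layers.Bidirectional(keras.layers.LSTM(32), input_shape=(window, 1)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=50, verbose=0)
    history, preds = list(train[-window:]), []
    for _ in range(horizon):
        x = np.array(history[-window:])[None, :, None]
        p = float(model.predict(x, verbose=0)[0, 0])
        preds.append(p)
        history.append(p)
    return np.array(preds)

# Decompose the resource-utilization series into K modes, forecast each, then sum.
utilization = np.loadtxt("memory_usage.csv")           # hypothetical input file
K, horizon = 5, 50                                     # assumed number of modes / horizon
modes, _, _ = VMD(utilization, alpha=2000, tau=0.0, K=K, DC=0, init=1, tol=1e-7)
forecast = sum(forecast_mode(m, horizon) for m in modes)
print("reconstructed forecast:", forecast[:5])
```

The key design point the sketch reflects is that the reconstruction step is a simple summation of the per-mode forecasts, since VMD decomposes the original signal additively.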
