Abstract
Due to the dynamic nature of cloud computing, it is very important for small to medium scale service providers to optimally assign computing resources and to apply accurate prediction methods that enable the best resource management. The choice of an ideal quality of service (QoS) prediction method is a key factor in business transactions: it helps a service provider manage the risk of SLA violations by taking appropriate and immediate action to reduce their occurrence, or to avoid operations that may cause risk. In this paper we analyze ten prediction methods, including neural network and stochastic methods, to predict time-series cloud data and compare their prediction accuracy over five time intervals. We use Cascade Forward Backpropagation, Elman Backpropagation, Generalized Regression, NARX, Simple Exponential Smoothing, Simple Moving Average, Weighted Moving Average, Extrapolation, Holt-Winters Double Exponential Smoothing and ARIMA to predict resource usage at 1, 2, 3, 4 and 5 hours into the future, using Root Mean Square Error (RMSE) and Mean Absolute Deviation (MAD) as benchmarks of prediction accuracy. From the prediction results we observe that the ARIMA method provides the most accurate predictions across all time intervals.
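The two accuracy metrics named above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the sample values for actual and predicted resource usage are hypothetical.

```python
import math

def rmse(actual, predicted):
    """Root Mean Square Error between paired observed/forecast series."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def mad(actual, predicted):
    """Mean Absolute Deviation between paired observed/forecast series."""
    n = len(actual)
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / n

# Hypothetical CPU-usage observations (%) vs. a model's 1-hour-ahead forecasts
actual = [52.0, 55.0, 61.0, 58.0, 60.0]
predicted = [50.0, 56.0, 59.0, 58.0, 63.0]

print(rmse(actual, predicted))  # ~1.897
print(mad(actual, predicted))   # 1.6
```

Because RMSE squares each error before averaging, it penalizes large deviations more heavily than MAD, which is why the paper reports both when comparing the ten methods.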