Abstract

The article develops a mathematical model of the process of consuming computing resources when they are incompletely released under virtualization technology. The model is formulated as a queuing system with an unlimited number of servers, a Poisson arrival flow of requests, and exponentially distributed service times. The model is analyzed with the methods of queuing theory: using the method of moments, the principal probabilistic characteristics of the amount of free resources, namely its mathematical expectation and variance, are derived. The proposed model makes it possible to estimate and predict how the amount of free resources of a virtual machine changes over time and to analyze its performance parameters.
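For orientation, a minimal sketch of the classical baseline that the described model generalizes, under assumed notation not given in the abstract (arrival rate $\lambda$, mean service time $1/\mu$): in an infinite-server system with Poisson arrivals and exponential service, the stationary number of occupied servers $N$ is Poisson distributed,

\[
P\{N = k\} = \frac{\rho^{k}}{k!}\, e^{-\rho}, \qquad \rho = \frac{\lambda}{\mu}, \qquad \mathbb{E}[N] = \operatorname{Var}[N] = \rho .
\]

If the virtual machine holds a total resource $C$ and each active request occupies one unit, the free resource is $R = C - N$, with $\mathbb{E}[R] = C - \rho$ and $\operatorname{Var}[R] = \rho$. The article's model departs from this baseline by accounting for incomplete release of resources when a request completes, which the moment-based analysis of the mean and variance of free resources captures.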
