Abstract

While cloud computing provides access to a large pool of experimental infrastructure, its most common form, the virtual machine, has been shown to exhibit substantial deficits in the accuracy of time measurements. In our ongoing work, we analyze these deficits in detail across various machine configurations. Preliminary results indicate that it is not virtualization as such, but rather the potentially uncontrollable utilization of the physical host, that is the decisive factor for the accuracy of time measurements.
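The kind of deficit at issue can be probed directly. As a rough, self-contained illustration (a sketch of our own, not code from the paper), the following Python snippet repeatedly times a fixed nominal sleep and reports the spread of the deviations; on a contended virtualized host this spread is typically far larger than on idle bare metal.

```python
import time
import statistics

# Illustrative sketch (not from the paper): probe wall-clock timer
# behaviour by timing a fixed nominal sleep and recording how far the
# measured interval deviates from the requested one.

NOMINAL_S = 0.010   # 10 ms target interval (arbitrary choice)
SAMPLES = 200

deviations_us = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    time.sleep(NOMINAL_S)
    elapsed = time.perf_counter() - start
    deviations_us.append((elapsed - NOMINAL_S) * 1e6)

print(f"mean deviation : {statistics.mean(deviations_us):8.1f} us")
print(f"stdev          : {statistics.stdev(deviations_us):8.1f} us")
print(f"max deviation  : {max(deviations_us):8.1f} us")
```

Comparing these statistics between a bare-metal machine and a VM whose host is under varying load is one simple way to separate the effect of virtualization itself from the effect of host utilization.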
