Abstract

In this paper, we present a modeling technique based on Jackson's network theorem to characterize the performance of mashups of multiple servers in cloud computing environments. The key challenge in providing new mashup mobile applications in cloud computing, such as real-time location-based services, is evaluating the overall delay that results from integrating multiple cloud servers. Furthermore, the number of virtual machines (VMs) affects the quality of service (QoS) for mobile applications under various traffic loads. However, an effective analytical model that characterizes both the effects of integrating multiple cloud servers and of scalable VMs is rarely seen in the literature. The proposed multi-cloud mashup analytical model can calculate the service waiting time for various numbers of VMs and different arrival rates. Through simulations and analysis, we show that, for various numbers of VMs and traffic loads, the proposed model accurately predicts the breakpoint at which the waiting time in mashup cloud servers sharply increases. The proposed mashup multi-cloud analytical model can therefore facilitate resource management in future cloud data centers.
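The abstract describes computing waiting times across integrated cloud servers via Jackson's network theorem, which lets each server pool be analyzed as an independent M/M/c queue. The following is a minimal sketch of that style of calculation, not the paper's actual model: it uses the standard Erlang C formula for an M/M/c node (c VMs, Poisson arrivals at rate lam, exponential service at rate mu) and sums per-stage delays for a serial mashup. The stage parameters are hypothetical placeholders.

```python
import math

def erlang_c(c, a):
    """Probability an arriving job must wait in an M/M/c queue.
    c: number of servers (VMs); a = lam / mu: offered load in Erlangs."""
    rho = a / c
    if rho >= 1:
        return 1.0  # overloaded node: every arrival waits
    num = a ** c / math.factorial(c)
    denom = (1 - rho) * sum(a ** k / math.factorial(k) for k in range(c)) + num
    return num / denom

def mmc_wait(lam, mu, c):
    """Mean queueing delay Wq in an M/M/c node; diverges as lam -> c * mu,
    which is the kind of sharp 'breakpoint' the abstract refers to."""
    a = lam / mu
    if a >= c:
        return float("inf")
    return erlang_c(c, a) / (c * mu - lam)

def mashup_response_time(lam, stages):
    """Total mean response time of a serial mashup of cloud services.
    By Jackson's theorem each node can be treated independently, so the
    total delay is the sum of per-node waiting plus service times.
    stages: list of (mu, c) pairs -- hypothetical per-service parameters."""
    return sum(mmc_wait(lam, mu, c) + 1.0 / mu for mu, c in stages)
```

For example, `mashup_response_time(0.8, [(1.0, 2), (0.5, 4)])` gives the end-to-end delay of a two-stage mashup; sweeping `lam` toward the capacity of the tightest stage exposes where the waiting time increases sharply.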
