Abstract

The Internet of Things (IoT) and Cloud Computing are two novel paradigms, each rapidly evolving in its own area of application. The former is enabled by several technologies ranging from communication systems to distributed intelligence, whereas the latter provides the means for massively parallel computation on demand. Cloud Computing can therefore be regarded as an enabling factor in the greater picture of the IoT. Given the complexity of the IoT's computing concepts, it is prudent to assume that this complexity will eventually produce computing workloads that differ in kind from current ones. It therefore becomes important to study how current scheduling optimization techniques can be adapted to such computing tasks. In this study, we evaluate the application of simulated annealing in a multi-cloud system serving a workload of processes with low parallelism but high arrival rates and highly variable run-times. A discrete event simulator was used to assess both the performance and the cost of the system. Simulation results indicate that the proposed scheduling technique achieves significant gains in both performance and cost in this context. Copyright © 2013 John Wiley & Sons, Ltd.
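To make the general idea concrete, the following is a minimal sketch of simulated annealing applied to task-to-cloud assignment. It is not the authors' implementation: the cost model (makespan over heterogeneous clouds), the neighbour move (relocating one task), and all parameter values (`t0`, `cooling`, `steps`) are illustrative assumptions, since the abstract does not specify them.

```python
import math
import random

def schedule_cost(assignment, runtimes, speeds):
    """Makespan: finishing time of the most heavily loaded cloud."""
    loads = [0.0] * len(speeds)
    for task, cloud in enumerate(assignment):
        loads[cloud] += runtimes[task] / speeds[cloud]
    return max(loads)

def anneal(runtimes, speeds, t0=10.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    n_clouds = len(speeds)
    # Start from a random task-to-cloud assignment.
    current = [rng.randrange(n_clouds) for _ in runtimes]
    best = current[:]
    cur_cost = best_cost = schedule_cost(current, runtimes, speeds)
    t = t0
    for _ in range(steps):
        # Neighbour move: reassign one random task to a random cloud.
        cand = current[:]
        cand[rng.randrange(len(runtimes))] = rng.randrange(n_clouds)
        cand_cost = schedule_cost(cand, runtimes, speeds)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta / t), so early (hot) phases
        # explore and late (cold) phases settle into a good schedule.
        if cand_cost < cur_cost or rng.random() < math.exp((cur_cost - cand_cost) / t):
            current, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = current[:], cur_cost
        t *= cooling  # geometric cooling schedule
    return best, best_cost

# Toy workload: highly variable run-times, two heterogeneous clouds.
runtimes = [5.0, 1.0, 8.0, 2.0, 4.0, 7.0]
speeds = [1.0, 2.0]
assignment, makespan = anneal(runtimes, speeds)
```

With this toy instance the optimal makespan is 9.0 (e.g. tasks totalling 9 units on the slow cloud, the remaining 18 units on the twice-as-fast one), so the annealer's result can be checked against that bound.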
