Abstract

While the effectiveness of fog computing in Internet of Things (IoT) applications has been widely investigated, there is still a lack of techniques for efficiently utilizing the computing resources of a fog platform to maximize Quality of Service (QoS) and Quality of Experience (QoE). This paper presents a resource management model for the service placement of distributed multitasking applications in fog computing, derived from a mathematical model of such a platform. Our main design goal is to reduce communication between the candidate nodes hosting the different task modules of an application by selecting a group of nodes that are near one another and as close to the data source as possible. We propose a method based on a greedy principle that achieves highly scalable, near-optimal performance on the resource mapping problem for multitasking applications in fog computing networks. Compared with the commercial Gurobi optimizer, our algorithm produces a mapping solution that attains 93% of the optimal performance, the gap being due to a higher communication cost, while outperforming the reference method in computing speed and cutting the mapping execution time to less than 1% of that of the Gurobi optimizer.
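
As a concrete illustration of this greedy selection principle, the following Python sketch places task modules one by one on the fog node that minimizes the combined communication cost to the data source and to the modules already placed. It is a minimal sketch under our own assumptions (a graph of fog nodes with per-link costs and simple capacity bookkeeping); the names greedy_placement, capacity, and demand are illustrative and do not come from the paper.

    import networkx as nx  # generic graph library, used here only for illustration

    def greedy_placement(fog_graph, modules, source, capacity, demand):
        """Greedily map each task module to a fog node, preferring nodes that are
        close to the data source and to the modules placed so far."""
        # Pre-compute shortest-path communication costs between all node pairs,
        # using the per-link 'cost' attribute as the edge weight.
        dist = dict(nx.all_pairs_dijkstra_path_length(fog_graph, weight="cost"))
        placement = {}
        for m in modules:
            best_node, best_cost = None, float("inf")
            for n in fog_graph.nodes:
                if capacity[n] < demand[m]:
                    continue  # node lacks free capacity for this module
                # Cost of choosing n: distance to the data source plus the distances
                # to every already-placed module (keeps the selected group compact).
                cost = dist[source][n] + sum(dist[placement[p]][n] for p in placement)
                if cost < best_cost:
                    best_node, best_cost = n, cost
            if best_node is None:
                raise RuntimeError(f"no feasible fog node for module {m}")
            placement[m] = best_node
            capacity[best_node] -= demand[m]
        return placement

After the shortest-path pre-computation, a pass like this costs roughly O(|modules|^2 · |nodes|), which is consistent with the large speed advantage over an exact optimizer reported in the abstract, though the sketch above is not the authors' implementation.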

Highlights


  • We introduced a model for handling IoT requests from multitasking applications in a fog computing network, together with an analytical model that formulates the resource management problem from a communication-cost perspective (a rough, assumed sketch of such a formulation is given after this list)

  • We proposed an algorithm based on a greedy principle to minimize the cost
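
The analytical formulation itself is not reproduced on this page. As a rough sketch of the kind of communication-cost objective referred to above, written in our own assumed notation rather than the paper's, the placement can be viewed as a constrained assignment problem:

    \begin{aligned}
    \min_{x}\;& \sum_{i,j \in M} \sum_{u,v \in N} b_{ij}\, d_{uv}\, x_{iu}\, x_{jv}
               \;+\; \sum_{i \in M} \sum_{u \in N} b_{si}\, d_{su}\, x_{iu} \\
    \text{s.t.}\;& \sum_{u \in N} x_{iu} = 1 \quad \forall i \in M, \qquad
               \sum_{i \in M} r_i\, x_{iu} \le C_u \quad \forall u \in N, \qquad
               x_{iu} \in \{0,1\},
    \end{aligned}

where M is the set of task modules, N the set of fog nodes, b_{ij} the traffic exchanged between modules i and j, b_{si} the traffic from the data source s to module i, d_{uv} the communication cost between nodes u and v, r_i the resource demand of module i, and C_u the capacity of node u. All symbols here are illustrative assumptions, not the paper's notation.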


Summary

Introduction

The past decade has witnessed a wide deployment of Internet of Things (IoT) technology across various application domains, and its pervasive role will continue to strengthen in the future [1]. In the conventional cloud-centric model, data is not processed in the proximity of the sensor but is transferred as-is to a server that may be located in the cloud. Transferring the constantly increasing volume of sensor data to the cloud is not feasible. To overcome this intrinsic limitation of centralized data processing in cloud computing, a new paradigm called fog computing was introduced. Fog computing is characterized by heterogeneity, dynamicity, mobility, and geographical distribution; it complements cloud computing services by providing local processing and faster responses for delay-sensitive applications, and is considered a derivative of cloud computing that extends its services to the network's edge [2].


