Abstract

With the advent of the Internet-of-Things (IoT), end devices have served as sensors, gateways, or local storage equipment. Because their resources are scarce, cloud-based computing is currently a necessary companion. However, raw data collected at the devices must be uploaded to a cloud server, consuming a significant amount of network bandwidth. In this paper, we propose an on-demand computation offloading architecture for fog networks that solicits available resources from nearby edge devices and distributes a suitable amount of computation tasks to them. The proposed architecture aims to finish a given computation job within a designated deadline at reduced network overhead. Our work consists of three elements: (1) resource provider network formation, which classifies nodes into stem or leaf depending on network stability; (2) task allocation based on each node’s resource availability and soliciting status; and (3) task redistribution in preparation for possible network and computation losses. Simulation-based validation in the iFogSim simulator demonstrates that our work achieves a high task completion rate within the designated deadline while drastically reducing unnecessary network overhead, by selecting only a few effective edge devices as computation delegates for locally networked computation.
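As a rough illustration of elements (1) and (2), the sketch below classifies candidate providers into stem and leaf nodes by a link-stability threshold and then greedily allocates a job's workload so that each node's share can finish before the deadline. It is a minimal sketch under our own assumptions; the names and metrics (`link_stability`, `mips_available`, the threshold value) are hypothetical and not the paper's notation.

```python
# Hypothetical sketch of stem/leaf classification and deadline-aware task
# allocation across nearby edge providers (assumed metrics, not the paper's).
from dataclasses import dataclass

@dataclass
class EdgeNode:
    name: str
    link_stability: float   # 0..1, fraction of time the link stayed connected (assumed metric)
    mips_available: float   # spare compute capacity in MIPS (assumed metric)

def classify(nodes, stability_threshold=0.8):
    """Split candidate providers into stable 'stem' nodes and volatile 'leaf' nodes."""
    stem = [n for n in nodes if n.link_stability >= stability_threshold]
    leaf = [n for n in nodes if n.link_stability < stability_threshold]
    return stem, leaf

def allocate(job_mi, deadline_s, providers):
    """Greedily hand out work, stem nodes first, capped at what each node can
    finish within the deadline; return the plan and any leftover work."""
    plan, remaining = {}, job_mi
    for node in providers:
        if remaining <= 0:
            break
        share = min(remaining, node.mips_available * deadline_s)
        plan[node.name] = share
        remaining -= share
    return plan, remaining

nodes = [EdgeNode("n1", 0.95, 500), EdgeNode("n2", 0.60, 800), EdgeNode("n3", 0.90, 300)]
stem, leaf = classify(nodes)
plan, leftover = allocate(job_mi=10_000, deadline_s=10, providers=stem + leaf)
print(plan, leftover)   # stem nodes n1/n3 are filled first; leftover would be redistributed
```

Any leftover work returned here corresponds to element (3): it would be redistributed to other providers or, as a last resort, to the cloud.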

Highlights

  • With recent advances in information technology, sensors and consumer devices have become connected with each other via the Internet-of-Things (IoT) to provide intelligent applications based on data-driven decisions

  • While cloud computing has been an easy-to-use solution for IoT, its innately centralized approach has critical limitations of high network bandwidth usage and large delays from continuously uploading the bulk of raw data collected at sensors

  • We propose a computation offloading architecture that couples two essential components, resource provider network construction and flexible resource provisioning, in a single framework


Summary

Introduction

With recent advances in information technology, sensors and consumer devices have become connected with each other via the Internet-of-Things (IoT) to provide intelligent applications based on data-driven decisions. However, IoT devices have resources too limited in computation, storage, and networking to satisfy the requirements of applications that need massive data processing. To bridge the gap between utility and constraint in IoT applications, cloud computing [1] is a way to flexibly provide the needed resources through powerful virtual servers connected over the network. While cloud computing has been an easy-to-use solution for IoT, its innately centralized approach has critical limitations: high network bandwidth usage and large delays from continuously uploading the bulk of raw data collected at sensors. A fog network, in contrast, is formed from locally connected edge devices that can perform task computation in a collaborative manner. Due to volatile wireless connectivity and device mobility, flexible yet reliable resource provisioning and management is a key challenge for its successful evolution.
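To make the bandwidth-versus-delay argument concrete, the back-of-the-envelope sketch below compares the completion time of uploading raw data to a distant cloud against splitting the same job across a few nearby fog nodes. All numbers and function names are illustrative assumptions, not measurements or notation from the paper.

```python
# Back-of-the-envelope comparison motivating local (fog) processing over
# uploading raw sensor data to the cloud. All figures are illustrative
# assumptions, not results from the paper.

def cloud_delay(data_mb, uplink_mbps, cloud_mips, job_mi):
    upload_s = data_mb * 8 / uplink_mbps          # time to push raw data upstream
    compute_s = job_mi / cloud_mips               # fast compute on a powerful server
    return upload_s + compute_s

def fog_delay(data_mb, lan_mbps, fog_mips, job_mi, n_helpers):
    transfer_s = data_mb * 8 / lan_mbps           # short-range transfer to nearby helpers
    compute_s = job_mi / (fog_mips * n_helpers)   # slower nodes, but the work is split
    return transfer_s + compute_s

print("cloud:", cloud_delay(data_mb=200, uplink_mbps=10, cloud_mips=40_000, job_mi=20_000))  # ~160.5 s
print("fog  :", fog_delay(data_mb=200, lan_mbps=100, fog_mips=1_000, job_mi=20_000, n_helpers=4))  # ~21 s
```

Under these assumed numbers, the wide-area upload dominates the cloud path, which is exactly the overhead the proposed fog-based offloading tries to avoid.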
