Abstract

Billions of Internet of Things (IoT) devices, e.g., sensors and RFIDs, are emerging around us, supporting services that are not only computing-intensive but also delay-sensitive, ranging from augmented/virtual reality to distributed data analysis and artificial intelligence. Notably, achieving low response latency for these IoT services requires computing capability that far exceeds what IoT devices can offer. To meet this demand, multiple computing paradigms are emerging, such as mobile transparent computing (TC), edge computing, and fog computing. These paradigms employ more resourceful edge devices, e.g., small-scale servers, smartphones, and laptops, to assist the low-end IoT devices. By offloading computing-intensive tasks to edge devices, data collection at IoT devices and data processing at edge devices are expected to converge, enabling the provisioning of computing-intensive and delay-sensitive services. However, many issues remain in the application of computation offloading that impede its flourishing in the IoT. To name a few: What are the killer APPs that need computation offloading for a performance boost? How to partition an encapsulated APP into offloadable code blocks for remote loading? How to determine which code blocks or computing tasks should be offloaded to edge servers? How to customize the communication protocol to guarantee the coherence of computation offloading?
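To make the offloading-decision question above concrete, the sketch below illustrates the classic latency-based offloading criterion commonly used in the computation-offloading literature: a task is offloaded only if shipping its input to an edge server and executing it there finishes sooner than executing it locally. This is an illustrative assumption, not the decision method proposed in this paper; all names and parameter values are hypothetical.

```python
# Illustrative sketch only: a latency-based offloading rule, not this paper's method.
# All parameter names and values are hypothetical.

from dataclasses import dataclass


@dataclass
class Task:
    cycles: float        # CPU cycles required to complete the task
    input_bytes: float   # input data that must be shipped to the edge server


def should_offload(task: Task,
                   f_local_hz: float,   # IoT device CPU frequency (cycles/s)
                   f_edge_hz: float,    # edge server CPU frequency (cycles/s)
                   uplink_bps: float    # uplink bandwidth (bits/s)
                   ) -> bool:
    """Offload iff transmission plus remote execution beats local execution."""
    local_latency = task.cycles / f_local_hz
    tx_latency = (task.input_bytes * 8) / uplink_bps
    remote_latency = task.cycles / f_edge_hz
    return tx_latency + remote_latency < local_latency


# Example: a 1e9-cycle task with a 200 kB input, a 100 MHz IoT CPU,
# a 3 GHz edge CPU, and a 10 Mb/s uplink -> offloading wins (0.49 s vs. 10 s).
if __name__ == "__main__":
    t = Task(cycles=1e9, input_bytes=200e3)
    print(should_offload(t, f_local_hz=1e8, f_edge_hz=3e9, uplink_bps=1e7))
```

In practice, such a rule would also account for edge-server load, energy consumption, and downlink time for results, which is part of why the offloading decision remains an open research question.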
