Abstract

Advances in communication technologies have enabled interaction among small devices, such as smartphones, wearables, and sensors, scattered across the Internet, bringing a whole new set of complex applications with ever-greater task processing needs. These Internet of things (IoT) devices run on batteries with strict energy restrictions, so they tend to offload task processing to remote servers, usually cloud computing (CC) data centers geographically distant from the IoT device. In this context, this work proposes a dynamic cost model to minimize energy consumption and task processing time for IoT scenarios in mobile edge computing (MEC) environments. Our approach provides a detailed cost model, implemented in an algorithm called TEMS, that considers the energy and time consumed during processing, the cost of data transmission, and the energy consumed by idle devices. The task scheduler chooses among the cloud, the MEC server, and the local IoT device to achieve a better execution time at a lower cost. Evaluation in a simulated environment reduced energy consumption by up to 51.6% and improved task completion time by up to 86.6%.
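
As a rough illustration of the weighted energy/time trade-off described above, the sketch below shows how a TEMS-style scheduler might pick an execution site; all names, numbers, and coefficient values here are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Option:
    name: str        # "local", "mec", or "cloud"
    energy_j: float  # estimated IoT-device energy for this choice (joules)
    time_s: float    # estimated task completion time (seconds)

def tems_style_choice(options, alpha=0.5, beta=0.5):
    """Pick the execution site with the lowest weighted cost.

    alpha weights energy and beta weights time; the paper studies how
    these coefficients influence the scheduling policy choices.
    """
    return min(options, key=lambda o: alpha * o.energy_j + beta * o.time_s)

# Hypothetical estimates: offloading to the MEC server wins when its
# combined transmission + processing cost undercuts local execution.
options = [
    Option("local", energy_j=4.0, time_s=2.5),  # compute on the IoT device
    Option("mec",   energy_j=1.2, time_s=0.8),  # offload to the edge server
    Option("cloud", energy_j=1.0, time_s=3.0),  # offload to the remote cloud
]
print(tems_style_choice(options).name)  # -> "mec" with equal weights
```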

Highlights

  • An International Data Corporation (IDC) report predicts that there will be 41.6 billion Internet of things (IoT) devices in 2025, with a potential for data generation of up to 79.4 ZB [1]

  • The tested scenarios evaluated input and result data sizes, task generation rate, deadlines for critical tasks, IoT device battery levels, and the use of dynamic voltage and frequency scaling (DVFS), illustrated in the sketch after this list
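
For readers unfamiliar with DVFS, the following sketch illustrates the standard CMOS dynamic power model, P = C * V^2 * f, that such evaluations commonly build on; the constants are made up for illustration and are not from the paper.

```python
# Standard CMOS dynamic power model: P = C * V^2 * f.
# All constants below are made up for illustration, not from the paper.
def task_energy(cycles, capacitance_f, voltage_v, freq_hz):
    power_w = capacitance_f * voltage_v ** 2 * freq_hz  # dynamic power (W)
    time_s = cycles / freq_hz                           # execution time (s)
    return power_w * time_s, time_s

# Lowering the frequency (and the voltage it permits) trades time for energy.
e_hi, t_hi = task_energy(2e9, 1e-9, 1.2, 2.0e9)  # high-performance setting
e_lo, t_lo = task_energy(2e9, 1e-9, 0.9, 1.0e9)  # energy-saving setting
print(f"high: {e_hi:.2f} J in {t_hi:.1f} s; low: {e_lo:.2f} J in {t_lo:.1f} s")
```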


Summary

Introduction

An International Data Corporation (IDC) report predicts that there will be 41.6 billion Internet of things (IoT) devices in 2025, with a potential for data generation of up to 79.4 ZB [1]. In this context, IoT applications have emerged with artificial intelligence, artificial vision, and object tracking, all of which require high computing power [2,3]. They usually rely on offloading task processing and data storage to remote cloud computing (CC) data centers to improve processing time and reduce battery energy consumption [4]. Those remote servers are geographically distant from the end user and the IoT devices, resulting in high latency due to delay and congestion over the communication channels [5,6,7].

Related Work
Problem Statement
Architecture and Task Processing Flow
Network Model
General Energy Consumption
Local Computing in the IoT Device
Local Computing in the MEC Server
Remote Computing in the Cloud
Model Constraints for IoT Device Battery
The TEMS Algorithm
Results
Simulated Hardware and Software Stack
Experiments and Results
Use of MEC Servers
IoT Device Battery Energy Consumption
Accuracy Evaluation of Energy Model
Variation of Input Data Size
Impact of Energy and Time Coefficients in the Schedule Policy Choices
Impact of Task Generation Rate
Using the DVFS Technique
Conclusions