Abstract

The growing demand and the diverse traffic patterns generated by heterogeneous Internet of Things (IoT) systems place an increasing strain on the IoT infrastructure at the network edge. Different edge resources (e.g. servers, routers, controllers, gateways) may exhibit different execution times and energy consumption for the same task. They should be capable of achieving high levels of performance to cope with the variability of task handling. However, edge nodes often struggle to apply optimal resource-distribution and energy-awareness policies that make effective run-time trade-offs among response-time constraints, model fidelity, inference accuracy, and task schedulability. To address these challenges, in this paper we present an SDN-based dynamic task scheduling and resource management approach based on Deep Reinforcement Learning (DRL) for IoT traffic scheduling at the network edge. First, we introduce the architectural design of our solution, with the specific objective of achieving high network performance. We then formulate a task assignment and scheduling problem that strives to minimize network latency while ensuring energy efficiency. The evaluation of our approach shows that it outperforms both deterministic and random task scheduling approaches, with significant gains in latency and energy consumption.
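To make the latency-energy trade-off described above concrete, the following is a minimal, hypothetical sketch (not the paper's actual model or parameters): a single-state tabular Q-learning agent that assigns tasks to edge nodes, where the reward is a negative weighted sum of an assumed per-node latency and energy cost.

```python
import random

# Assumed per-node characteristics: (mean latency in ms, energy per task in J).
# These values and the trade-off weights below are illustrative only.
NODES = [(10.0, 5.0), (20.0, 2.0), (15.0, 3.0)]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
W_LATENCY, W_ENERGY = 0.7, 0.3   # assumed latency/energy trade-off weights

def reward(node):
    latency, energy = NODES[node]
    # Negative weighted cost: the agent learns to minimise both terms jointly.
    return -(W_LATENCY * latency + W_ENERGY * energy)

def train(episodes=2000, seed=0):
    rng = random.Random(seed)
    q = [0.0] * len(NODES)           # single-state Q-table over node choices
    for _ in range(episodes):
        if rng.random() < EPSILON:   # epsilon-greedy exploration
            a = rng.randrange(len(NODES))
        else:
            a = max(range(len(NODES)), key=lambda i: q[i])
        q[a] += ALPHA * (reward(a) + GAMMA * max(q) - q[a])
    return q

q = train()
best_node = max(range(len(NODES)), key=lambda i: q[i])
```

Under these assumed costs, the agent converges on the node with the lowest weighted latency-energy cost; the paper's actual DRL formulation operates over a richer state space (traffic patterns, node load) than this toy single-state example.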
