Abstract

The Internet-of-Things (IoT) edge extends cloud computing services to topology- and location-sensitive distributed computing. As an immediate benefit, it improves network reliability and latency by enabling rapid, efficient data access and processing close to IoT devices. However, it raises several issues stemming from complexity, security, energy consumption, and the instability caused by decentralized service placement. Furthermore, multi-resource allocation and task scheduling make edge management far from straightforward. Blockchain has been envisioned to enforce trustworthiness in diverse IoT environments, but processing IoT transactions on a blockchain incurs high latency and high energy costs. This paper introduces a novel blockchain-based Deep Reinforcement Learning (DRL) approach that enables energy-aware task scheduling and offloading in a Software Defined Networking (SDN)-enabled IoT network. An Asynchronous Advantage Actor-Critic (A3C) DRL-based policy performs efficient task scheduling and offloading, operating in symbiosis with a Proof-of-Authority (PoA) blockchain consensus that validates IoT transactions and blocks. In doing so, the approach improves reliability, reduces latency, and achieves energy efficiency for SDN-enabled IoT networks. The combination of the A3C policy with the blockchain is analyzed theoretically. Experiments show that our approach offers 50% better energy efficiency, outperforms traditional consensus algorithms, i.e., Proof of Work (PoW) and PBFT, in terms of throughput and network latency, and offers better scheduling performance.
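To make the abstract's scheduling idea concrete, the following is a minimal, hypothetical sketch of an advantage actor-critic worker deciding between local execution and offloading. It is not the paper's implementation: the linear actor/critic, the toy state (queue load, link quality, battery), and the reward that favors offloading are all illustrative assumptions; a full A3C system would run many such workers asynchronously against a shared network.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

class A3CWorker:
    """One worker of an A3C-style learner (single-step, linear models):
    an actor produces offloading probabilities, a critic estimates the
    state value, and both are updated with the advantage A = R - V(s)."""

    def __init__(self, n_state, n_action, lr=0.05):
        self.actor_w = np.zeros((n_state, n_action))  # policy logits
        self.critic_w = np.zeros(n_state)             # value weights
        self.lr = lr

    def act(self, state):
        probs = softmax(state @ self.actor_w)
        return rng.choice(len(probs), p=probs), probs

    def update(self, state, action, reward):
        advantage = reward - state @ self.critic_w     # A = R - V(s)
        _, probs = self.act(state)
        grad = -probs
        grad[action] += 1.0  # gradient of log pi(a|s) w.r.t. the logits
        self.actor_w += self.lr * advantage * np.outer(state, grad)
        self.critic_w += self.lr * advantage * state   # move V(s) toward R

# Toy episode: state = [queue_load, link_quality, battery];
# actions: 0 = execute locally, 1 = offload to the edge.
worker = A3CWorker(n_state=3, n_action=2)
state = np.array([0.8, 0.3, 0.5])
for _ in range(300):
    a, _ = worker.act(state)
    # Hypothetical energy-aware reward: offloading pays off under high load.
    reward = 1.0 if a == 1 else -1.0
    worker.update(state, a, reward)
_, probs = worker.act(state)
print(probs[1])  # probability of offloading after training
```

After a few hundred updates the policy shifts probability mass toward offloading, illustrating how an advantage-weighted policy gradient can learn an energy-aware scheduling decision from scalar rewards.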
