Abstract

Maintaining adequate energy in low-power Internet of Things (IoT) nodes is crucial for applications such as smart homes and autonomous industries. These IoT nodes exploit adaptive duty-cycling techniques to use their energy resources efficiently. However, adaptive duty cycling causes the nodes to operate asynchronously, inducing energy holes in the network. These energy holes lead to information loss and poor quality of service in IoT networks. In this context, energy harvesting using a Mobile Energy Transmitter (MET) can extend the lifetime of an IoT network. In this work, we introduce a metric named the Age of Charging (AoC) to quantify the repetitive charging of power-deficient IoT nodes. An energy-efficient MET scheduling scheme is proposed to minimize the expected average AoC while maximizing the energy harvested by the IoT nodes. To this end, the optimization problem is first remodeled as a Markov decision process. Subsequently, a deep reinforcement learning algorithm based on the twin delayed deep deterministic policy gradient (TD3) scheme is developed for energy-efficient MET scheduling in asynchronous IoT networks. Simulation results indicate that the proposed algorithm outperforms the conventional deep Q-network (DQN) and soft actor-critic (SAC) algorithms. These results motivate the use of MET-aided energy harvesting in self-sustaining IoT networks.
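As a rough illustration of how an AoC-style metric might be computed, the sketch below assumes an Age-of-Information-like formulation in which a node's AoC grows linearly with time and resets to zero whenever the MET charges that node; the abstract does not give the formal definition, so this model, the function name average_aoc, and the input format are assumptions for illustration only.

```python
def average_aoc(horizon, charge_times_per_node):
    """Time-averaged Age of Charging over `horizon` discrete slots.

    charge_times_per_node: one sorted list per IoT node, holding the slots
    at which the MET charged that node (hypothetical input format).
    """
    total = 0.0
    for charges in charge_times_per_node:
        last_charge = 0  # assume every node starts freshly charged at slot 0
        aoc_sum = 0
        idx = 0
        for t in range(1, horizon + 1):
            # Reset the age whenever the MET visits (charges) this node.
            if idx < len(charges) and charges[idx] == t:
                last_charge = t
                idx += 1
            aoc_sum += t - last_charge  # age grows linearly between charges
        total += aoc_sum / horizon
    # Expected average AoC across all nodes.
    return total / len(charge_times_per_node)


if __name__ == "__main__":
    # Two nodes over 10 slots: node 0 charged at slots 3 and 7, node 1 at slot 5.
    print(average_aoc(10, [[3, 7], [5]]))  # -> 2.0
```

Under this assumed model, the MET scheduler's objective is to choose visit times that keep these per-node ages small on average, which is what the proposed TD3-based policy learns to do.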
