Abstract

Breakthroughs in Wireless Energy Transfer (WET) technologies have revitalized Wireless Rechargeable Sensor Networks (WRSNs). However, scheduling mobile chargers rationally remains a challenging problem. Most existing work neither accounts for the variability of scenarios nor determines how many mobile chargers are most appropriate for each dispatch. Moreover, prior work on the mobile charger scheduling problem has focused mainly on reducing the number of dead nodes, while the most critical metric of network performance, the packet arrival rate, has been relatively neglected. In this paper, we develop a DRL-based Partial Charging (DPC) algorithm. Based on the number and urgency of charging requests, we classify charging requests into four scenarios and design a corresponding request allocation algorithm for each. A Deep Reinforcement Learning (DRL) algorithm is then employed to train a decision model that uses environmental information to select the optimal request allocation algorithm for the current scenario. Once the allocation of charging requests is confirmed, to improve the Quality of Service (QoS), i.e., the packet arrival rate of the entire network, a partial charging scheduling algorithm is designed to maximize the total charging duration of nodes in the ideal state while ensuring that all charging requests are completed. In addition, we analyze the traffic information of the nodes and use the Analytic Hierarchy Process (AHP) to determine node importance, compensating for the inaccurate estimation of a node's remaining lifetime in realistic scenarios. Simulation results show that the proposed algorithm outperforms existing algorithms in terms of the number of alive nodes and the packet arrival rate.
