Abstract

Mobile charging is a new energy-replenishment technology for Wireless Rechargeable Sensor Networks (WRSNs), in which a Mobile Charger (MC) charges nodes sequentially according to a mobile charging schedule that takes node charging timeliness and Quality of Sensing Coverage (QSC) as its criteria. Sensing coverage is a critical network property and has received increasing attention in recent research on mobile charging scheduling in WRSNs. Because the network environment is usually uncertain and charging demands may change dynamically over time, online mobile charging scheduling is crucial; however, existing online approaches mostly rely on specific network models that are difficult to obtain in practical applications. In this paper, we propose a novel model-free deep reinforcement learning algorithm, Multistage Exploration Deep Q-Network (MEDQN), for the Online Mobile Charging Scheduling with optimal Quality of Sensing Coverage (OMCS-QSC) problem in WRSNs. The MC is designed as an agent that explores online charging schedules via a new multistage exploration strategy to maximize the network QSC according to the real-time network state. We also design a novel reward function that evaluates the MC's charging actions via the real-time sensing coverage contributions of the nodes. Extensive simulations show that MEDQN converges stably and outperforms existing online algorithms, especially in large-scale WRSNs.
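The abstract does not spell out how the multistage exploration strategy or the coverage-based reward are computed; the following is a minimal illustrative sketch, assuming a stage-wise exploration-rate schedule (instead of a single continuous decay) and a reward that pays the node's sensing coverage contribution when the charged node stays alive. All names and parameter values here are hypothetical, not the paper's actual design.

```python
def multistage_epsilon(step, stages=((1000, 1.0), (5000, 0.5), (20000, 0.1)),
                       floor=0.01):
    """Exploration rate for a DQN agent at a given training step.

    `stages` is a sequence of (end_step, epsilon) pairs: while `step` is
    below an end_step, the paired epsilon applies; past the final stage,
    only a small residual rate `floor` remains (mostly exploitation).
    """
    for end_step, eps in stages:
        if step < end_step:
            return eps
    return floor


def coverage_reward(contribution, alive_after, death_penalty=1.0):
    """Reward for an MC charging action, based on the node's real-time
    sensing coverage contribution; a penalty applies if the node still
    runs out of energy despite the schedule (hypothetical formulation)."""
    return contribution if alive_after else -death_penalty
```

Under this sketch, early training steps explore broadly (epsilon = 1.0), later stages progressively exploit the learned Q-values, and the reward steers the agent toward charging nodes whose coverage contribution is largest.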
