Abstract

Mobile charging is a practical solution to overcome energy constraints in wireless rechargeable sensor networks (WRSNs). It utilizes a mobile charger (MC) to charge sensors via wireless power transfer according to a charging scheduling scheme. However, scheduling the MC to achieve high charging efficiency at low mobile charging cost, while preserving network functionality, remains a challenge. Here, the mobile charging cost is the total energy consumed by the MC during the charging task. This paper studies the charging sequence scheduling problem with optimal mobile charging cost and charging efficiency (CSCE), considering dynamic changes in sensor energy consumption, and models CSCE as a multi-objective mixed-integer nonlinear program. To address this problem, we propose an improved deep Q-network approach for CSCE (IDQN-CSCE), in which the MC acts as an agent that explores the WRSN and determines a charging policy based on the charging demands of sensors. IDQN-CSCE employs Q-learning and a deep Q-network (DQN) to train the Q-table and the DQN simultaneously. Charging actions are selected probabilistically, with the probability of using Q-learning decreasing as the number of iteration steps increases. Additionally, we design a novel reward function consisting of a weighted sum of rewards and penalties fed back by the environment after each charging action. Simulation results show that IDQN-CSCE achieves higher charging efficiency than baseline approaches at lower mobile charging cost.
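The hybrid selection mechanism described above can be sketched in a few lines. This is an illustrative assumption of how it might look, not the authors' exact design: the function names, the linear decay schedule, and the reward weights below are all placeholders introduced for this example.

```python
import random

def select_action(step, q_table_action, dqn_action, decay=0.001, p_min=0.05):
    """Pick the Q-table action with a probability that decays over training.

    Assumed linear decay: early steps rely on Q-learning, later steps on
    the DQN, matching the abstract's description that the probability of
    using Q-learning decreases as iteration steps increase.
    """
    p_q = max(p_min, 1.0 - decay * step)  # hypothetical decay schedule
    return q_table_action if random.random() < p_q else dqn_action

def reward(charge_gain, travel_cost, dead_sensors, w1=1.0, w2=0.5, w3=2.0):
    """Weighted sum of a reward term and penalty terms (weights are
    placeholders, not values from the paper)."""
    return w1 * charge_gain - w2 * travel_cost - w3 * dead_sensors
```

For instance, at step 0 the decayed probability is 1.0, so the agent always follows the Q-table; at very large step counts it falls to the floor `p_min` and the DQN dominates.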
