Abstract

In Internet of Things (IoT)-based data collection systems, sensor nodes (SNs) periodically transmit data to a server through a wireless gateway (GW). However, SNs located far from the GW consume a large amount of energy to transmit their data. In this paper, we design an unmanned aerial vehicle (UAV)-based wireless power transfer and data collection system. In the proposed system, the UAV patrols the target area, and each SN periodically transmits its sensed data to either the GW or the UAV, depending on its distance to each. To steadily collect a sufficient number of fresh data while avoiding energy depletion at both the SNs and the UAV, the UAV periodically makes three types of decisions: 1) where to move; 2) whether to transmit or aggregate the data collected from SNs; and 3) whether to transfer energy to SNs. To optimize these decisions, we propose a deep reinforcement learning (DRL)-based UAV operation decision algorithm (DeepUAV). In DeepUAV, the controller continuously learns online, improving the three types of UAV decisions through trial and error. Evaluation results demonstrate that DeepUAV outperforms comparison schemes in terms of the average number of fresh data collected in each epoch.
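The three periodic UAV decisions described above can be viewed as one joint action chosen by a reinforcement learning agent. The sketch below is a hypothetical, simplified illustration only: the paper's DeepUAV trains a deep network online, whereas here a tabular Q-function and a toy 1-D patrol environment stand in so the example stays self-contained. All names (`UAVAgent`, `step`, the action encoding) are assumptions, not the paper's implementation.

```python
import random

# Joint UAV action = (movement, collect data from SNs?, transfer energy to SNs?)
MOVES = ["stay", "left", "right"]
ACTIONS = [(m, c, t) for m in MOVES for c in (False, True) for t in (False, True)]

class UAVAgent:
    """Epsilon-greedy Q-learning over the joint UAV action space (toy stand-in
    for the deep network used by DeepUAV)."""
    def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
        self.q = {}  # (state, action) -> estimated value
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.rng = random.Random(seed)

    def select_action(self, state):
        # Explore with probability epsilon, otherwise act greedily.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q.get((state, a), 0.0))

    def update(self, s, a, r, s_next):
        # Standard one-step Q-learning update.
        best_next = max(self.q.get((s_next, a2), 0.0) for a2 in ACTIONS)
        old = self.q.get((s, a), 0.0)
        self.q[(s, a)] = old + self.alpha * (r + self.gamma * best_next - old)

def step(pos, action, sn_pos=3, width=5):
    """Toy 1-D patrol environment: reward for collecting data at the SN cell,
    a small cost for transferring energy. State is just the UAV position."""
    move, collect, transfer = action
    if move == "left":
        pos = max(0, pos - 1)
    elif move == "right":
        pos = min(width - 1, pos + 1)
    reward = 1.0 if (collect and pos == sn_pos) else 0.0
    if transfer:
        reward -= 0.1
    return pos, reward

# Online trial-and-error learning, as in the abstract's description.
agent = UAVAgent()
pos = 0
for _ in range(5000):
    a = agent.select_action(pos)
    nxt, r = step(pos, a)
    agent.update(pos, a, r, nxt)
    pos = nxt

# Greedy joint action at the SN cell after training (typically includes
# collecting data, since that is the only positive reward here).
greedy = max(ACTIONS, key=lambda a: agent.q.get((3, a), 0.0))
```

The point of the joint-action encoding is that the three decisions interact (e.g., transferring energy costs reward now but keeps SNs alive), so they are chosen together rather than independently.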
