Abstract

In Unmanned Aerial Vehicle (UAV)-assisted Wireless Powered Internet of Things (IoT), the UAV is employed to charge the IoT nodes remotely via Wireless Power Transfer (WPT) and collect their data. A key challenge of resource management for WPT and data collection is preventing battery drainage and buffer overflow of the ground IoT nodes in the presence of highly dynamic airborne channels. In this paper, we consider the resource management problem in practical scenarios, where the UAV has no a priori information on the battery levels and data queue lengths of the nodes. We formulate the resource management of UAV-assisted WPT and data collection as a Markov Decision Process (MDP), where the states consist of the battery levels and data queue lengths of the IoT nodes, the channel qualities, and the position of the UAV. A deep Q-learning based resource management scheme is proposed to minimize the overall data packet loss of the IoT nodes by optimally selecting the IoT node to serve for data collection and power transfer, and the modulation scheme of the selected node.
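The sketch below illustrates how such an MDP state (per-node battery levels and queue lengths, channel qualities, UAV position) and joint (node, modulation) action could be encoded and trained with a deep Q-network. It is a minimal illustration under assumed dimensions and hyperparameters, not the implementation or parameter choices of the paper.

```python
# Minimal DQN sketch for the UAV resource-management MDP described above.
# N_NODES, N_MODULATIONS, network sizes, and hyperparameters are illustrative
# assumptions; the reward here is assumed to be the negative packet loss.
import random
from collections import deque

import torch
import torch.nn as nn

N_NODES = 4            # assumed number of ground IoT nodes
N_MODULATIONS = 3      # assumed modulation schemes per node
# state = battery + queue length per node, channel quality per node, UAV (x, y)
STATE_DIM = 2 * N_NODES + N_NODES + 2
N_ACTIONS = N_NODES * N_MODULATIONS  # joint choice of (node, modulation)

class QNetwork(nn.Module):
    """Maps the observed state to Q-values for every (node, modulation) action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, state):
        return self.net(state)

q_net, target_net = QNetwork(), QNetwork()
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)   # experience replay buffer of (s, a, r, s') tuples
gamma, epsilon = 0.99, 0.1

def select_action(state):
    """Epsilon-greedy choice; node = a // N_MODULATIONS, modulation = a % N_MODULATIONS."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(q_net(torch.as_tensor(state, dtype=torch.float32)).argmax())

def dqn_update(batch_size=32):
    """One gradient step on the temporal-difference loss over a sampled minibatch."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2 = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
    q = q_net(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + gamma * target_net(s2).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this sketch the flat action index jointly encodes which IoT node to serve and which modulation it should use, matching the decision variables stated in the abstract; the target network and replay buffer are the standard deep Q-learning stabilizers.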
