Abstract
Owing to its flexibility and low operational cost, an unmanned aerial vehicle (UAV) can act as an edge server for data collection and processing in delay-sensitive monitoring applications in Internet of Things (IoT) networks. One of its major disadvantages, however, is its limited battery capacity. This paper studies the problem of energy-efficient and fresh data collection in rechargeable UAV-assisted IoT networks. In particular, the UAV takes off from an initial position to collect data packets from sensor nodes (SNs) and must reach a final position by a given deadline. Charging stations (CSs) deployed in the network can recharge the UAV via wireless power transfer to keep its energy level from falling below a threshold. To minimize the weighted sum of the average age of information (AoI) and the average recharging price, we formulate a Markov Decision Process (MDP) that jointly determines the UAV's flight trajectory, the scheduling of SNs, and energy recharging. The MDP is then solved by a rechargeable UAV-assisted data collection algorithm based on dueling double deep Q-networks (D3QN). Extensive simulations show that the proposed D3QN algorithm reduces the weighted sum of the average AoI and the average recharging price more effectively than the baseline algorithms.
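To make the D3QN idea in the abstract concrete, the sketch below shows the two ingredients the acronym combines: a dueling head that splits the Q-function into a state-value stream V(s) and an advantage stream A(s, a), and the double-DQN target, in which the online network selects the next action while the target network evaluates it. This is a minimal NumPy illustration with linear streams, not the paper's actual network; the feature dimension, action set, and weight shapes are assumptions for the example.

```python
import numpy as np

N_ACTIONS = 5  # hypothetical action set (e.g., hover plus four flight directions)


def dueling_q(features, w_value, w_adv):
    """Dueling head: Q(s, a) = V(s) + (A(s, a) - mean_a A(s, a)).

    Subtracting the mean advantage makes the V/A split identifiable."""
    v = features @ w_value                       # (batch, 1) state values
    a = features @ w_adv                         # (batch, N_ACTIONS) advantages
    return v + (a - a.mean(axis=-1, keepdims=True))


def double_dqn_target(reward, gamma, next_feat, online, target):
    """Double DQN target: r + gamma * Q_target(s', argmax_a Q_online(s', a)).

    Decoupling action selection from evaluation reduces overestimation bias."""
    w_v_on, w_a_on = online
    w_v_tg, w_a_tg = target
    best = np.argmax(dueling_q(next_feat, w_v_on, w_a_on), axis=-1)
    q_tg = dueling_q(next_feat, w_v_tg, w_a_tg)
    return reward + gamma * q_tg[np.arange(len(best)), best]
```

In the paper's setting, the reward would encode the negative weighted sum of AoI and recharging price, and the state would include the UAV's position, residual energy, and the SNs' AoI values; those details are abstracted away here.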