Abstract

Solar-powered unmanned aerial vehicles (SUAVs) are a promising solution for extending the flight time of unmanned aerial vehicles (UAVs), reducing the human intervention required for battery charging. Exploiting networked SUAVs to provide long-duration wireless communication can not only improve signal transmission reliability but also realize energy autonomy. To reap these benefits, in this article we propose an efficient energy and radio resource management framework based on intelligent power cognition at the SUAVs. With this framework, power-cognitive SUAVs learn the environment, including the spatial distribution of solar energy density, the channel state evolution, and the traffic patterns of wireless communication applications, and adapt to environmental changes. These SUAVs intelligently adjust energy harvesting, information transmission, and flight trajectory to improve the utilization of solar energy toward two primary goals: staying aloft for long periods and achieving high communication performance. We adopt reinforcement learning to compute decisions that maximize the total system throughput over the SUAV's lifetime. Simulation results show that the proposed power cognition scheme can simultaneously improve the communication throughput and the harvested energy of SUAVs.
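
The abstract does not specify which reinforcement learning algorithm or state/action formulation the authors use, so the following is only a minimal illustrative sketch, assuming a toy discretization: the SUAV state is a (grid cell, battery bucket) pair, actions couple movement with a harvest-or-transmit mode, and a tabular Q-learning agent (a stand-in, not necessarily the paper's method) trades off harvesting solar energy against transmitting for throughput. The grid size, battery levels, solar and channel values, and reward shaping below are all hypothetical.

```python
import numpy as np

# Hypothetical discretization: the SUAV moves on a 3x3 grid of waypoints,
# its state is (cell index, battery bucket), and each action couples a
# movement with a harvest/transmit mode. The reward mixes delivered
# throughput with a penalty for depleting the battery, loosely mirroring
# the two goals named in the abstract (staying aloft, high throughput).

N_CELLS = 9          # 3x3 grid of waypoints
N_BATTERY = 5        # coarse battery buckets (0 = empty, 4 = full)
N_ACTIONS = 6        # 4 moves + hover-and-harvest + hover-and-transmit

ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1
rng = np.random.default_rng(0)

# Assumed (not from the paper): per-cell solar intensity and channel quality.
solar = rng.uniform(0.2, 1.0, N_CELLS)
channel = rng.uniform(0.2, 1.0, N_CELLS)

Q = np.zeros((N_CELLS, N_BATTERY, N_ACTIONS))

def step(cell, batt, action):
    """Toy environment transition: returns (next_cell, next_batt, reward)."""
    if action < 4:                              # move to a neighbouring cell
        row, col = divmod(cell, 3)
        drow, dcol = [(-1, 0), (1, 0), (0, -1), (0, 1)][action]
        row = min(max(row + drow, 0), 2)
        col = min(max(col + dcol, 0), 2)
        cell = 3 * row + col
        batt = max(batt - 1, 0)                 # flying costs energy
        reward = 0.0
    elif action == 4:                           # hover and harvest solar energy
        gained = 1 if rng.random() < solar[cell] else 0
        batt = min(batt + gained, N_BATTERY - 1)
        reward = 0.0
    else:                                       # hover and transmit
        if batt > 0:
            batt -= 1
            reward = channel[cell]              # throughput proxy
        else:
            reward = -1.0                       # tried to transmit with no energy
    if batt == 0:
        reward -= 0.5                           # discourage running the battery flat
    return cell, batt, reward

for episode in range(2000):
    cell, batt = rng.integers(N_CELLS), N_BATTERY - 1
    for _ in range(50):
        if rng.random() < EPSILON:              # epsilon-greedy exploration
            action = int(rng.integers(N_ACTIONS))
        else:
            action = int(np.argmax(Q[cell, batt]))
        next_cell, next_batt, reward = step(cell, batt, action)
        td_target = reward + GAMMA * np.max(Q[next_cell, next_batt])
        Q[cell, batt, action] += ALPHA * (td_target - Q[cell, batt, action])
        cell, batt = next_cell, next_batt

print("Greedy action per cell at full battery:", np.argmax(Q[:, -1, :], axis=1))
```

Under these assumptions the learned policy tends to transmit in cells with good channel quality and harvest when the battery runs low, which is the qualitative behavior the power cognition scheme targets; the paper's actual formulation (continuous trajectories, real channel and solar models) is far richer than this sketch.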
