Abstract

Unmanned aerial vehicles (UAVs) are being developed rapidly for various applications such as UAV taxis and delivery drones. However, limited battery energy restricts the flight distance of UAVs. Thus, urban prosumers equipped with drone recharging stations are introduced to provide charging services for the UAVs. In this article, firstly, a day-ahead energy scheduling problem for UAV-charging-enabled urban prosumers is studied, where the objective is to maximize the overall energy satisfaction of the prosumers while ensuring the quality of service (QoS) of the charged UAVs. To deal with the considered problem, we decompose it into two stages: 1) a day-ahead energy requirement data prediction stage, and 2) a per-prosumer energy scheduling stage. Secondly, a joint method is proposed that combines hierarchical federated learning (HFL) on a long short-term memory (LSTM) architecture (HFL-LSTM) with stochastic game-based multi-agent double deep Q-learning (MADDQN) using a community agent-independent approach. In particular, the HFL-LSTM approach is leveraged to forecast each prosumer's energy requirement data without centrally collecting local prosumers' data, thereby protecting data privacy. Then, a stochastic game is adopted to analyze the formulated problem, aiming to find the Nash equilibrium strategy. Afterward, MADDQN with a community agent-independent method is utilized to obtain the best energy scheduling strategy for each prosumer. Finally, the experimental results demonstrate the superiority of the proposed joint method, which achieves the lowest mean squared error (0.0152) and the highest energy satisfaction (36388) under the Nash equilibrium policy compared with the benchmarks.
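
As a rough illustration of the first stage only, the sketch below shows how a two-level federated-averaging round over per-prosumer LSTM forecasters might be organized: prosumer models are trained locally, averaged within each community, and the community models are then averaged at the cloud. This is a minimal sketch, not the authors' implementation; the class name EnergyLSTM, the 24-step forecast horizon, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of a hierarchical federated averaging (HFL) round over
# per-prosumer LSTM forecasters. Illustrative only; names and settings
# are assumptions, not details taken from the paper.
import copy
import torch
import torch.nn as nn

class EnergyLSTM(nn.Module):
    """LSTM mapping a window of past energy readings to a day-ahead forecast."""
    def __init__(self, hidden_size=64, horizon=24):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x):                     # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])       # forecast: (batch, horizon)

def fed_avg(state_dicts):
    """Element-wise average of model weights (plain FedAvg)."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg

def hfl_round(global_model, communities, local_epochs=1, lr=1e-3):
    """One HFL round: local training -> community aggregation -> cloud aggregation."""
    community_states = []
    for prosumer_loaders in communities:       # each community: list of prosumer DataLoaders
        prosumer_states = []
        for loader in prosumer_loaders:
            local = copy.deepcopy(global_model)  # each prosumer trains on its own data only
            opt = torch.optim.Adam(local.parameters(), lr=lr)
            for _ in range(local_epochs):
                for x, y in loader:
                    opt.zero_grad()
                    nn.functional.mse_loss(local(x), y).backward()
                    opt.step()
            prosumer_states.append(local.state_dict())
        community_states.append(fed_avg(prosumer_states))    # community-level average
    global_model.load_state_dict(fed_avg(community_states))  # cloud-level average
    return global_model
```

Only averaged model weights leave each prosumer in this scheme, which is how the raw energy requirement data stays local; the second-stage MADDQN scheduler is not sketched here.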
