Abstract

Future vehicle-to-grid (V2G) systems require more flexible scheduling to adjust and flatten peak energy demand. For efficient scheduling and energy trading, the utility provider (UP) needs to track the state of charge (SoC) of vehicle batteries (VBs). However, sharing the SoC of VBs from electric vehicles (EVs) with the UP may compromise owner privacy, since an adversary can infer owner behavior by analyzing the EVs' electricity usage. We therefore propose reinforcement learning (RL)-based demand-side energy management using a rechargeable battery (RB) for enhanced cost-friendly privacy of EVs, efficient scheduling, and accurate billing. With existing Q-learning-based RL (using $\epsilon$-greedy exploration and exploitation), we find that reward maximization for efficient and private scheduling is often sluggish and suffers from convergence issues. We therefore develop genetic algorithm (GA)-based exploration and exploitation, which resolves these convergence problems. We provide theoretical analysis and numerical results demonstrating that the proposed GA-based RL framework accelerates convergence and considerably enhances cost-friendly privacy.
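As a point of reference for the baseline the abstract critiques, the following is a minimal sketch of tabular Q-learning with $\epsilon$-greedy exploration. It is not the paper's model: the three-level SoC environment, the reward shape, and all hyperparameters here are hypothetical stand-ins chosen only to make the update rule concrete.

```python
import random

random.seed(0)

# Toy stand-in environment (NOT the paper's system model): a battery with
# three SoC levels and two actions, idle (0) or charge (1). The reward
# loosely favors charging when empty and idling when full.
N_STATES, N_ACTIONS = 3, 2
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # hypothetical hyperparameters

def step(state, action):
    """Hypothetical transition: charging raises SoC, idling lowers it."""
    next_state = min(state + 1, N_STATES - 1) if action == 1 else max(state - 1, 0)
    good = (state == 0 and action == 1) or (state == N_STATES - 1 and action == 0)
    return next_state, (1.0 if good else -0.1)

def epsilon_greedy(Q, state):
    """Explore uniformly with probability EPSILON, otherwise exploit."""
    if random.random() < EPSILON:
        return random.randrange(N_ACTIONS)
    return max(range(N_ACTIONS), key=lambda a: Q[state][a])

Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
state = 0
for _ in range(5000):
    action = epsilon_greedy(Q, state)
    next_state, reward = step(state, action)
    # Standard Q-learning temporal-difference update
    Q[state][action] += ALPHA * (reward + GAMMA * max(Q[next_state]) - Q[state][action])
    state = next_state

best_at_empty = max(range(N_ACTIONS), key=lambda a: Q[0][a])
print(best_at_empty)  # greedy action learned at the lowest SoC level
```

The GA-based variant proposed in the paper would replace the `epsilon_greedy` selection step with population-based search over actions; the surrounding Q-update loop is what both approaches share.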
