Abstract

Recent years have seen a significant increase in the adoption of electric vehicles, as well as in investments in electric vehicle charging infrastructure and rooftop photo-voltaic installations. The ability to delay electric vehicle charging provides inherent flexibility that can be used to compensate for the intermittency of photo-voltaic generation and to optimize against fluctuating electricity prices. Exploiting this flexibility, however, requires smart control algorithms capable of handling uncertainties in photo-voltaic generation, electric vehicle energy demand and user behaviour. This paper proposes a control framework combining the advantages of reinforcement learning and rule-based control to coordinate the charging of a fleet of electric vehicles in an office building. The control objective is to maximize self-consumption of locally generated electricity and, consequently, minimize the electricity cost of electric vehicle charging. The performance of the proposed framework is evaluated on a real-world data set from EnergyVille, a Belgian research institute. Simulation results show that the proposed control framework achieves a 62.5% electricity cost reduction compared to a business-as-usual or passive charging strategy. In addition, it performs within 5% of a theoretical near-optimal strategy that assumes perfect knowledge of the required energy and user behaviour of each electric vehicle.
