Abstract

The development of hydrogen-based vehicles (HVs) can help achieve a zero-carbon future; however, the limited availability of hydrogen refueling stations (HRSs) prevents their widespread adoption as personal vehicles. Existing studies have investigated the energy management of HRSs using various methods, but none has applied a deep reinforcement learning (DRL) approach to handle the associated uncertainties and achieve real-time decision making. This study proposes an energy management optimization model of an on-grid HRS based on an improved dueling double deep Q-network (D3QN) algorithm with NoisyNet. The primary goal is to reduce the operating cost of the HRS and improve voltage stability while satisfying the hydrogen demand of HVs. Notably, this study adopts an improved version of the double deep Q-network (DDQN), namely the NoisyNet-D3QN (NN-D3QN) approach, because NoisyNet aids efficient exploration and the dueling network generalizes learning across actions. The adopted NN-D3QN algorithm outperforms the baseline algorithms: compared with the NN-DDQN, D3QN, and DDQN approaches, the reward of the proposed method increases by 19.08%, 31.66%, and 39.26%, respectively.
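The two components named above can be sketched briefly. A NoisyNet layer replaces epsilon-greedy exploration with learned, factorized Gaussian noise on the layer parameters, and a dueling head splits the Q-value into a state value V(s) and an action advantage A(s, a), combined as Q = V + A − mean(A). The following minimal NumPy sketch illustrates both ideas; it is not the paper's implementation, and all layer sizes, initializations, and names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

class NoisyLinear:
    """Factorized-Gaussian noisy linear layer (illustrative sketch).

    Effective weights are w = mu_w + sigma_w * (f(eps_out) outer f(eps_in)),
    so exploration noise is part of the learned parameters rather than
    an external epsilon-greedy schedule.
    """
    def __init__(self, in_dim, out_dim, sigma0=0.5):
        bound = 1.0 / np.sqrt(in_dim)
        self.mu_w = rng.uniform(-bound, bound, (out_dim, in_dim))
        self.mu_b = rng.uniform(-bound, bound, out_dim)
        self.sigma_w = np.full((out_dim, in_dim), sigma0 / np.sqrt(in_dim))
        self.sigma_b = np.full(out_dim, sigma0 / np.sqrt(in_dim))
        self.in_dim, self.out_dim = in_dim, out_dim

    @staticmethod
    def _f(x):
        # Noise-shaping function used in factorized NoisyNets.
        return np.sign(x) * np.sqrt(np.abs(x))

    def __call__(self, x, noisy=True):
        if noisy:
            eps_in = self._f(rng.standard_normal(self.in_dim))
            eps_out = self._f(rng.standard_normal(self.out_dim))
            w = self.mu_w + self.sigma_w * np.outer(eps_out, eps_in)
            b = self.mu_b + self.sigma_b * eps_out
        else:  # evaluation mode: use the mean parameters only
            w, b = self.mu_w, self.mu_b
        return w @ x + b

def dueling_q(value, advantages):
    """Dueling combination: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)."""
    return value + advantages - advantages.mean()
```

Subtracting the mean advantage makes the V/A decomposition identifiable, which is what lets the dueling network generalize a shared state value across actions.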
