Abstract

Deep reinforcement learning (DRL) has become a research focus for the energy management of fuel cell vehicles (FCVs) as a means of improving hydrogen utilization efficiency. However, since DRL-based energy management strategies (EMSs) must be retrained whenever the FCV type changes, developing DRL-based EMSs for different FCVs is a laborious task. To address this, this article introduces transfer learning (TL) into DRL to design a novel deep transfer reinforcement learning (DTRL) method and then proposes an intelligent transferable energy management framework between two different urban FCVs based on the designed DTRL method, enabling the reuse of well-trained EMSs. First, an enhanced soft actor-critic (SAC) algorithm integrating prioritized experience replay (PER) is formulated as the DRL algorithm studied in this article. Then, an enhanced-SAC-based EMS for a light fuel cell hybrid electric vehicle (FCHEV) is pre-trained using massive real-world driving data. Next, the knowledge stored in the FCHEV's well-trained EMS is captured and transferred into the EMS of a heavy-duty fuel cell hybrid electric bus (FCHEB). Finally, the FCHEB's EMS is fine-tuned in a stochastic environment to ensure adaptability to real driving conditions. Simulation results indicate that, compared to the state-of-the-art baseline EMS, the proposed DTRL-based EMS accelerates convergence by 91.55% and improves fuel economy by 6.78%. This article contributes to shortening the development cycle of DRL-based EMSs and improving the utilization efficiency of hydrogen energy in the urban transport sector.
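To illustrate the PER mechanism that the abstract pairs with SAC, the sketch below shows a minimal proportional prioritized replay buffer in plain Python. This is a hypothetical illustration only: the class name, hyperparameters (`alpha`, `beta`), and ring-buffer storage are assumptions, since the abstract does not specify the paper's actual implementation (e.g., sum-tree storage or annealing schedules).

```python
import random

class PrioritizedReplayBuffer:
    """Minimal proportional PER sketch (assumed structure, not the paper's code).

    Transitions with larger TD errors are sampled more often; importance-sampling
    weights correct the resulting bias in the gradient estimate.
    """

    def __init__(self, capacity, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha        # priority exponent (0 recovers uniform sampling)
        self.buffer = []          # stored transitions
        self.priorities = []      # one priority per stored transition
        self.pos = 0              # next write position (ring buffer)

    def add(self, transition, td_error=1.0):
        # New transitions get priority |delta|^alpha (small epsilon avoids zero).
        priority = (abs(td_error) + 1e-6) ** self.alpha
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(priority)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = priority
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, beta=0.4):
        total = sum(self.priorities)
        probs = [p / total for p in self.priorities]
        idxs = random.choices(range(len(self.buffer)), weights=probs, k=batch_size)
        n = len(self.buffer)
        # Importance-sampling weights, normalized by the batch maximum.
        weights = [(n * probs[i]) ** (-beta) for i in idxs]
        max_w = max(weights)
        weights = [w / max_w for w in weights]
        batch = [self.buffer[i] for i in idxs]
        return batch, idxs, weights

    def update_priorities(self, idxs, td_errors):
        # After a learning step, refresh priorities with the new TD errors.
        for i, err in zip(idxs, td_errors):
            self.priorities[i] = (abs(err) + 1e-6) ** self.alpha
```

In an enhanced-SAC training loop, the agent would push each transition with its TD error, sample a weighted minibatch for the critic update, and then call `update_priorities` with the recomputed errors.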
