Abstract

Fuel cell hybrid electric vehicles (FCHEVs) offer a sustainable form of transportation that advances environmental protection. An effective energy management strategy (EMS) is crucial for reducing the usage cost of an FCHEV and maintaining the battery state of charge (SOC). This study establishes separate degradation models for the fuel cell and the lithium battery, incorporating the lifetime decay of both energy sources into the EMS. To address the sparse-reward problem during training, a novel deep-reinforcement-learning-based energy management strategy is proposed that combines the twin delayed deep deterministic policy gradient (TD3) framework with learning rate annealing (AL) and hindsight prioritized experience replay (HPER), yielding strong training performance. Experimental results demonstrate significant advantages of the EMS based on the HPER_AL_TD3 algorithm over traditional TD3-based approaches. The proposed EMS adapts well to various driving cycles, keeps the SOC stable, and reduces the overall usage cost. This research aims to enhance the learning capability of deep-reinforcement-learning-based EMS and to contribute to the promotion of FCHEVs.
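The abstract names three ingredients of the proposed EMS: the TD3 framework, learning rate annealing, and hindsight prioritized experience replay. The paper's implementation is not reproduced here; the sketch below is only an illustrative outline, assuming hypothetical names (anneal_lr, PrioritizedReplay), of how a linear learning-rate schedule and a proportional prioritized buffer could sit alongside a TD3-style update loop, and it omits the hindsight goal-relabelling step.

```python
# Illustrative sketch (not the authors' code): learning-rate annealing plus a
# simplified proportional prioritized replay buffer, as they might be wired
# into a TD3-style training loop. All names here are assumptions.
import numpy as np


def anneal_lr(lr_init, lr_min, step, total_steps):
    """Linearly decay the learning rate from lr_init to lr_min over training."""
    frac = min(step / total_steps, 1.0)
    return lr_init + frac * (lr_min - lr_init)


class PrioritizedReplay:
    """Proportional prioritized experience replay (simplified, no hindsight relabelling)."""

    def __init__(self, capacity, alpha=0.6):
        self.capacity, self.alpha = capacity, alpha
        self.data, self.priorities = [], []

    def add(self, transition, priority=1.0):
        if len(self.data) >= self.capacity:      # drop the oldest transition when full
            self.data.pop(0)
            self.priorities.pop(0)
        self.data.append(transition)
        self.priorities.append(priority)

    def sample(self, batch_size):
        p = np.array(self.priorities) ** self.alpha
        p /= p.sum()                              # sampling probability proportional to priority^alpha
        idx = np.random.choice(len(self.data), batch_size, p=p)
        return idx, [self.data[i] for i in idx]

    def update_priorities(self, idx, td_errors):
        for i, e in zip(idx, td_errors):
            self.priorities[i] = abs(e) + 1e-6    # keep priorities strictly positive
```

In such a loop, the annealed learning rate would be fed to the actor and critic optimizers at each step, while the buffer's priorities would be refreshed from the critic's TD errors after every update.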
