Abstract

The service life and fuel consumption of the fuel cell system (FCS) are the main factors limiting the commercialization of fuel cell electric vehicles (FCEVs). An effective energy management strategy (EMS) can reduce fuel consumption over a drive cycle and prolong the service life of the FCS. This paper proposes an EMS based on a deep reinforcement learning (DRL) algorithm, deep Q-learning (DQL). Because conventional DQL is unstable during training, an improved algorithm, Double Deep Q-Learning (DDQL), is introduced. DDQL uses a target network to evaluate the actions selected by the online network, together with a delayed update strategy, which improves the convergence and stability of DRL. The strategy is trained on the UDDS cycle, tested on a combined UDDS-WLTC-NEDC cycle, and compared with a traditional ECM-based EMS. The results demonstrate that, under the combined cycle, the proposed strategy reduced FCS voltage degradation by 50%, maintained fuel economy, and kept the final state of charge (SOC) of the lithium-ion battery (LIB) consistent with its initial value.
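The core DDQL idea described above (the online network selects the action, the target network evaluates it, and the target network is updated with a delay) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear "networks", state and action dimensions, reward, and the soft-update rate `tau` are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

n_states, n_actions = 4, 3                          # illustrative sizes, not the paper's
W_online = rng.normal(size=(n_states, n_actions))   # online Q-network (toy linear map)
W_target = W_online.copy()                          # target network starts as a copy

def q_values(W, s):
    """Q(s, .) under a linear stand-in for a Q-network with weights W."""
    return s @ W

def double_dqn_target(s_next, reward, done, gamma=0.99):
    """Double DQN target: online net picks the action, target net evaluates it."""
    a_star = int(np.argmax(q_values(W_online, s_next)))   # action selection (online)
    q_eval = q_values(W_target, s_next)[a_star]           # action evaluation (target)
    return reward + (0.0 if done else gamma * q_eval)

def soft_update(tau=0.005):
    """Delayed (soft) update: move the target network slowly toward the online one."""
    global W_target
    W_target = tau * W_online + (1.0 - tau) * W_target

# Example: compute a training target for one transition, then update the target net.
s_next = rng.normal(size=n_states)
y = double_dqn_target(s_next, reward=1.0, done=False)
soft_update()
```

Decoupling action selection from action evaluation in this way is what counteracts the overestimation bias of plain DQL, and the slow target update is what stabilizes training.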
