Abstract

An energy management strategy (EMS) based on reinforcement learning is proposed in this study to enhance the fuel economy and durability of a fuel cell hybrid bus (FCHB). First, a comprehensive powertrain system model for the FCHB is established, mainly including the FCHB’s power balance, fuel cell system (FCS) efficiency, and aging models. Second, the state–action space, state transition probability matrix (TPM), and multi-objective reward function of the Q-learning algorithm are designed to improve the fuel economy and the durability of the power sources. The FCHB’s demand power and battery state of charge (SOC) serve as the state variables, and the FCS output power is used as the action variable. Using operating data from the demonstration FCHB, a state TPM is constructed to represent its overall operation. Finally, an EMS employing Q-learning is formulated to optimize the fuel economy of the FCHB, maintain the battery SOC, suppress FCS power fluctuations, and extend FCS lifetime. The proposed EMS is tested and verified through hardware-in-the-loop (HIL) tests. The simulation results demonstrate the effectiveness of the proposed strategy. Compared to a rule-based EMS, the Q-learning-based EMS improves the energy economy by 7.8%. Furthermore, its energy economy differs by only 3.7% from the best result obtained under dynamic optimization, while FCS degradation is effectively reduced and FCS durability is enhanced.
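The abstract outlines a tabular Q-learning formulation: demand power and battery SOC as the state, FCS output power as the action, and a multi-objective reward covering hydrogen consumption, SOC maintenance, and FCS power fluctuation. The sketch below shows one possible shape of such an agent; the state discretization, FCS power range, reward weights, and all names are illustrative assumptions rather than the paper's implementation, and the environment dynamics (which the paper encodes as a TPM built from demonstration-bus data) are left out.

    # Minimal tabular Q-learning sketch for a fuel-cell EMS.
    # Illustrative only: bins, power range, and reward weights are assumed,
    # not taken from the paper.
    import numpy as np

    N_PWR, N_SOC, N_ACT = 20, 20, 10                # demand-power bins, SOC bins, FCS power levels
    P_FCS_LEVELS = np.linspace(0.0, 60.0, N_ACT)    # candidate FCS output powers [kW] (assumed range)

    Q = np.zeros((N_PWR, N_SOC, N_ACT))             # Q-table over (demand power, SOC) x action
    alpha, gamma, eps = 0.1, 0.95, 0.1              # learning rate, discount factor, exploration rate

    def reward(h2_rate, soc, d_p_fcs, w_soc=50.0, w_flux=0.01, soc_ref=0.6):
        """Multi-objective reward: penalize hydrogen use, SOC deviation from a
        reference, and FCS power fluctuation (weights are illustrative)."""
        return -(h2_rate + w_soc * (soc - soc_ref) ** 2 + w_flux * d_p_fcs ** 2)

    def choose_action(s_pwr, s_soc):
        """Epsilon-greedy selection of an FCS power level for the current state."""
        if np.random.rand() < eps:
            return np.random.randint(N_ACT)
        return int(np.argmax(Q[s_pwr, s_soc]))

    def q_update(s, a, r, s_next):
        """Standard Q-learning temporal-difference update of the Q-table."""
        s_pwr, s_soc = s
        sp, ss = s_next
        td_target = r + gamma * np.max(Q[sp, ss])
        Q[s_pwr, s_soc, a] += alpha * (td_target - Q[s_pwr, s_soc, a])

In a training loop, each step would sample the next state from the transition model, evaluate the reward from the powertrain model, and call q_update; the converged Q-table then defines the FCS power command for every (demand power, SOC) pair.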
