Abstract

As a core component of hybrid electric vehicles (HEVs), the energy management strategy (EMS) directly affects fuel-saving performance by regulating the energy flow between the engine and the battery. Most existing studies on EMS focus on buses or commuter cars, whose driving cycles are relatively fixed; however, there is also great demand for EMSs that adapt to variable driving cycles. The rise of machine learning, especially deep learning and reinforcement learning, offers a new opportunity for EMS design in HEVs. Motivated by this, a double deep Q-network (DDQN)-based EMS for HEVs under variable driving cycles is proposed herein. The distance traveled within the driving cycle is introduced as an additional state of the DDQN-based EMS. The "curse of dimensionality" caused by the enlarged state space during training is mitigated by the generalization capability of the deep neural network. To address the overestimation problem in training, two separate neural networks are used for action selection and target value calculation, respectively. The effectiveness and adaptability of the proposed DDQN-based EMS to variable driving cycles are verified through simulation comparisons with a Q-learning-based EMS and a rule-based EMS in terms of fuel economy improvement.
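To illustrate the two-network mechanism described above, the following is a minimal sketch of the double-DQN target computation in PyTorch. The network architecture, state layout (SOC, speed, power demand, distance traveled), action discretization, and all names are assumptions for illustration only, not the authors' implementation.

```python
import torch
import torch.nn as nn


def build_q_network(state_dim: int, n_actions: int) -> nn.Module:
    # Simple fully connected Q-network: state -> Q-value per discrete action
    return nn.Sequential(
        nn.Linear(state_dim, 64), nn.ReLU(),
        nn.Linear(64, 64), nn.ReLU(),
        nn.Linear(64, n_actions),
    )


# Hypothetical state: [SOC, vehicle speed, power demand, distance traveled]
state_dim, n_actions = 4, 11                          # e.g. 11 discretized engine-power levels
online_net = build_q_network(state_dim, n_actions)    # selects the greedy action
target_net = build_q_network(state_dim, n_actions)    # evaluates that action
target_net.load_state_dict(online_net.state_dict())   # periodic sync in practice


def ddqn_target(reward, next_state, done, gamma=0.99):
    """Double-DQN target: the online net picks the argmax action,
    the target net supplies its value, which reduces overestimation."""
    with torch.no_grad():
        best_action = online_net(next_state).argmax(dim=1, keepdim=True)
        next_q = target_net(next_state).gather(1, best_action).squeeze(1)
        return reward + gamma * (1.0 - done) * next_q
```

Decoupling action selection from action evaluation in this way is what distinguishes DDQN from standard deep Q-learning, where a single network performs both roles and tends to overestimate Q-values.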
