Abstract

Deep reinforcement learning has made significant progress in areas such as robotics, games, and autonomous vehicles. However, obtaining an optimal result from deep reinforcement learning requires extensive training, which is time-consuming and difficult to apply in real-time vehicle energy management. This study uses expert knowledge to warm start deep reinforcement learning for the energy management of a hybrid electric vehicle, thereby reducing the learning time. Expert domain knowledge is encoded as a set of rules, which can be represented by a decision tree. By directly transferring the logical rules of the decision tree into neural network weights and biases, the agent can begin learning effective policies immediately after initialization. The results show that the expert knowledge-based warm start agent achieves a higher initial reward during training than the cold start, and a warm start with more expert knowledge outperforms one with less expert knowledge in the initial learning stage. The proposed warm start method requires 76.7% less time to reach convergence than the cold start. The proposed method is also compared with a conventional rule-based method and an equivalent consumption minimization strategy, reducing energy consumption by 8.62% and 3.62%, respectively. This work can facilitate expert knowledge-based warm starts of deep reinforcement learning in hybrid electric vehicle energy management problems.
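The mechanism described above, transferring decision-tree rules into neural network weights and biases, can be illustrated with a minimal NumPy sketch. It follows a standard rule-to-network construction in which each tree split becomes a saturated sigmoid unit, each leaf a soft AND over its required split outcomes, and the output layer routes each leaf to its expert action; the resulting weights then serve as the initialization of the agent's value or policy network before training. The state features (SOC and power demand), thresholds, actions, and the sharpness constant `K` below are hypothetical placeholders, not the paper's calibration.

```python
import numpy as np

K = 20.0  # sigmoid sharpness: larger values approximate hard rule boundaries

def sigmoid(z):
    # Clip to avoid overflow warnings for strongly saturated units.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))

# Hypothetical expert rules, state = [SOC, power_demand_kW]:
#   Rule 1: SOC < 0.30                      -> action 0: engine on (charge sustaining)
#   Rule 2: SOC >= 0.30 and demand > 25 kW  -> action 1: hybrid assist
#   Rule 3: otherwise                       -> action 2: pure electric drive
# Internal split nodes of the equivalent decision tree, each testing "feature > threshold":
splits = [(0, 0.30), (1, 25.0)]  # (feature index, threshold)

# Leaves as conjunctions over split outcomes: +1 = "greater", -1 = "less",
# 0 = "don't care"; each leaf maps to a discrete powertrain action.
leaves = [
    ([-1,  0], 0),   # low SOC              -> engine on
    ([+1, +1], 1),   # SOC ok, high demand  -> hybrid assist
    ([+1, -1], 2),   # SOC ok, low demand   -> EV mode
]
n_features, n_actions = 2, 3

# Layer 1: one unit per split; sigmoid(K * (x[f] - t)) approximates step(x[f] > t).
W1 = np.zeros((len(splits), n_features))
b1 = np.zeros(len(splits))
for j, (f, t) in enumerate(splits):
    W1[j, f] = K
    b1[j] = -K * t

# Layer 2: one unit per leaf; a soft AND over the split outcomes the leaf requires.
W2 = np.zeros((len(leaves), len(splits)))
b2 = np.zeros(len(leaves))
for i, (pattern, _) in enumerate(leaves):
    required = sum(1 for p in pattern if p != 0)
    negated = sum(1 for p in pattern if p < 0)
    for j, p in enumerate(pattern):
        if p != 0:
            W2[i, j] = K * p
    b2[i] = K * (negated - required + 0.5)  # fires only if all required outcomes hold

# Output layer: route each leaf unit to the expert action it prescribes.
W3 = np.zeros((n_actions, len(leaves)))
b3 = np.zeros(n_actions)
for i, (_, action) in enumerate(leaves):
    W3[action, i] = 1.0

def warm_policy(state):
    """Forward pass of the warm-started network; returns per-action scores."""
    h1 = sigmoid(W1 @ state + b1)
    h2 = sigmoid(W2 @ h1 + b2)
    return W3 @ h2 + b3

# Before any RL training, the network already reproduces the expert rules.
for soc, demand in [(0.20, 10.0), (0.60, 40.0), (0.60, 10.0)]:
    scores = warm_policy(np.array([soc, demand]))
    print(f"SOC={soc:.2f}, demand={demand:4.1f} kW -> action {int(scores.argmax())}")
```

Any additional layers, and whatever output structure the chosen deep reinforcement learning algorithm requires, would be initialized as usual and refined during subsequent training, as in the cold-start case.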
