Abstract

This paper proposes a real-time energy management strategy (EMS) for hybrid electric vehicles (HEVs) that incorporates reinforcement learning (RL) into a model predictive control (MPC) framework, avoiding the inherent drawbacks of RL (excessive learning time and lack of adaptability) while remarkably enhancing the real-time performance of MPC. First, the MPC framework for the energy management problem is formulated, in which a novel long short-term memory (LSTM) neural network is used to construct the velocity predictor for more accurate prediction; its prediction capability is verified by a comparative analysis. Then, the HEV prediction model and the velocity predictor are regarded as the model with which the RL agent interacts. On this basis, the optimal control sequence over the prediction horizon is learned through model-based RL, but only the first element is actually executed, and the RL process begins anew after the prediction horizon moves forward. In simulation, the algorithm's convergence is analyzed and the influence of the prediction horizon length is evaluated. The proposed EMS is then compared with dynamic programming (DP), conventional MPC, and a pure RL method, and the results demonstrate its performance and adaptability. Finally, a hardware-in-the-loop test validates its practical applicability.
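The receding-horizon scheme described above can be sketched in code. The following is a minimal, illustrative toy (not the paper's implementation): the HEV prediction model, the stage cost, the horizon length, and the constant-velocity stand-in for the LSTM predictor are all assumptions made for the sketch. It shows the key idea: at each step, model-based RL (here, tabular Q-learning against the prediction model) learns a control sequence over the horizon, only the first action is executed, and learning restarts after the horizon shifts.

```python
import numpy as np

HORIZON = 5    # prediction horizon length (assumed value)
N_ACTIONS = 3  # discretized power-split actions (assumed)

def predict_velocity(history, horizon):
    """Stand-in for the paper's LSTM velocity predictor:
    a naive constant-velocity forecast over the horizon."""
    return np.full(horizon, history[-1])

def step_model(soc, action, v):
    """Toy HEV prediction model: battery state-of-charge dynamics
    plus a fuel-proxy stage cost. Purely illustrative."""
    soc_next = np.clip(soc + 0.01 * (action - 1) - 0.001 * v, 0.0, 1.0)
    cost = 0.1 * abs(action - 1) + 0.05 * v * (1.0 - soc_next)
    return soc_next, cost

def learn_sequence(soc, v_pred, episodes=200, eps=0.2, alpha=0.3):
    """Model-based RL over the horizon: Q-learning episodes rolled out
    against the prediction model, minimizing cumulative cost."""
    rng = np.random.default_rng(0)
    Q = np.zeros((HORIZON, N_ACTIONS))
    for _ in range(episodes):
        s = soc
        for k in range(HORIZON):
            # epsilon-greedy exploration over the cost-to-go table
            a = rng.integers(N_ACTIONS) if rng.random() < eps else int(Q[k].argmin())
            s_next, c = step_model(s, a, v_pred[k])
            target = c + (Q[k + 1].min() if k + 1 < HORIZON else 0.0)
            Q[k, a] += alpha * (target - Q[k, a])
            s = s_next
    return Q.argmin(axis=1)  # greedy control sequence over the horizon

# Receding-horizon execution: learn the whole sequence,
# apply only the first element, shift the horizon, repeat.
soc, v_hist = 0.6, [12.0]
for t in range(10):
    v_pred = predict_velocity(v_hist, HORIZON)
    seq = learn_sequence(soc, v_pred)
    soc, _ = step_model(soc, int(seq[0]), v_pred[0])  # execute first action only
    v_hist.append(float(v_pred[0]))
print(round(soc, 3))
```

In the paper, the constant-velocity forecast would be replaced by the trained LSTM predictor, and the toy dynamics by the full HEV prediction model; the receding-horizon structure is what carries over.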
