Abstract

One of the factors hindering the large-scale application of lithium-ion batteries is their thermal safety. Real-time monitoring of core temperature holds promise in alleviating this concern; however, calibrating individual models for thousands of battery cells is impractical and gives rise to big-data challenges. Mature and rapidly advancing artificial intelligence techniques offer a promising route to core temperature prediction. In this study, we conduct a benchmark test of typical recurrent neural networks for core temperature prediction. The test covers three main aspects: data acquisition, data processing, and data analysis. First, we conduct dynamic battery tests across a wide temperature range on a custom-built experimental platform. Subsequently, we apply big-data techniques for data pre-processing, including filtering, normalization, and sliding-window construction. Hyperparameters are then optimized using Bayesian optimization with the Tree-structured Parzen Estimator, followed by K-fold cross-validation, yielding three best-performing models based on typical recurrent neural network architectures. Finally, we compare these models extensively in terms of performance and complexity. According to the results, all data-driven methods achieve satisfactory performance, with RMSE within 0.17 °C and MAE within 0.15 °C. Among models with similar predictive performance, the gated recurrent unit exhibits the lowest model complexity and the fastest training speed. We also validate the portability and generalizability of the forecasting models. This study lays a foundation for future research in this area and provides guidance for engineering applications to ensure the safe operation of battery energy storage systems.
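The pre-processing pipeline mentioned above (normalization followed by sliding windows) can be sketched as follows. This is an illustrative example only: the window length, prediction horizon, and min-max scaling choice are assumptions for demonstration, not the paper's actual settings.

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Slide a fixed-length window over a 1-D temperature signal to build
    (input, target) pairs for a recurrent model. Each input is `window`
    consecutive samples; the target is the sample `horizon` steps after
    the window ends. Window and horizon values here are illustrative."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window + horizon - 1])
    return np.asarray(X), np.asarray(y)

# Min-max normalization to [0, 1] before windowing (assumed scaler choice)
temps = np.array([20.0, 21.0, 22.5, 24.0, 25.0, 25.5])
norm = (temps - temps.min()) / (temps.max() - temps.min())

X, y = make_windows(norm, window=3)
# X: three overlapping windows of length 3; y: the sample following each window
```

In practice each window would also carry companion channels such as current, voltage, and surface temperature, stacked along a feature axis before being fed to the recurrent network.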
