Abstract

The fast charging of lithium-ion batteries while minimizing battery degradation is a key challenge for the battery community. The main difficulties in this optimization are the high dimensionality of the parameter space of charging strategies, the significant variability between batteries (even from the same production line), and the limited quantitative information on battery degradation mechanisms. Current approaches to addressing these challenges are model-based optimization and grid search. Model-based methods are limited by the insufficient complexity and accuracy of electrochemical models, especially in the early stage of development when a new battery chemistry is being introduced to the market, while grid search methods are expensive in terms of testing time and cells. This article proposes a data-driven Bayesian optimization (BO) approach to the minimum-charging-time problem, in which a constrained expected improvement acquisition function is employed to explicitly handle constraints that limit degradation. In addition, continuously varying current charging protocols are incorporated into the proposed BO approach by means of polynomial function expansions. The effectiveness of the proposed approach is demonstrated on LIONSIMBA, a battery simulator based on porous electrode theory. The simulation results show that the proposed BO-based charging approach with a continuous current profile outperforms the commonly used constant-current constant-voltage (CC-CV) method for the minimum-charging-time problem. Moreover, the decrease in the minimum charging time and the increase in its variance with an increasing number of degrees of freedom in the charging protocol are also quantified.
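The following Python sketch illustrates the two ingredients named in the abstract: a charging current parameterized by a low-order polynomial in time, and a constrained expected improvement (cEI) score computed from Gaussian surrogate predictions of the objective (charging time) and the degradation constraint. This is a minimal illustration, not the authors' implementation; the function names, polynomial degree, Gaussian-process surrogates, and numerical values are assumptions made for exposition only.

```python
import numpy as np
from scipy.stats import norm

def current_profile(coeffs, t, t_end):
    """Charging current I(t) as a polynomial in normalized time tau = t / t_end.

    The coefficients are the decision variables tuned by BO; their number sets the
    degrees of freedom of the protocol (a single coefficient recovers constant current).
    (Illustrative parameterization, assumed for this sketch.)
    """
    tau = t / t_end
    return sum(c * tau**k for k, c in enumerate(coeffs))

def constrained_expected_improvement(mu_f, sigma_f, best_f, mu_g, sigma_g):
    """cEI = EI(objective) * P(constraint satisfied), for a minimization problem.

    mu_f, sigma_f : surrogate posterior mean/std of charging time at a candidate protocol
    best_f        : best feasible charging time observed so far
    mu_g, sigma_g : surrogate posterior mean/std of the degradation metric (feasible if g <= 0)
    """
    improvement = best_f - mu_f
    z = improvement / sigma_f
    ei = improvement * norm.cdf(z) + sigma_f * norm.pdf(z)   # standard EI for minimization
    prob_feasible = norm.cdf(-mu_g / sigma_g)                # P(g <= 0) under a Gaussian posterior
    return max(ei, 0.0) * prob_feasible

# Example with made-up surrogate predictions for one hypothetical candidate protocol.
print(constrained_expected_improvement(mu_f=1800.0, sigma_f=120.0, best_f=1900.0,
                                        mu_g=-0.5, sigma_g=0.3))
```

In a BO loop, the candidate protocol maximizing this acquisition value would be simulated (e.g., in LIONSIMBA), the surrogates updated with the new observation, and the process repeated until the evaluation budget is exhausted.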
