An essential aspect of battery research and development is exploring cell parameters to deliver optimal performance. Battery optimization is challenging because of the high cost and long time required to evaluate different configurations in experiments or simulations; optimizing cycling performance is especially costly, since cycling itself is extremely time consuming. The vast parameter space adds to this challenge. In this talk, I will present some of our recent work that leverages machine learning and physics-informed machine learning for cell optimization.

I will first show an unsupervised machine learning approach that reduces battery testing time by about 75% [1]. The framework comprises two main components: a pruner and a sampler. The pruner employs the Asynchronous Successive Halving Algorithm and Hyperband to halt the cycling of unpromising batteries, conserving resources for further exploration. The sampler, built on the Tree-structured Parzen Estimator, predicts promising configurations for future cycling based on the query history. Notably, the framework accommodates categorical, discrete, and continuous parameters and can operate asynchronously in parallel to support multiple cells cycling simultaneously.

Next, I will show a method that embeds physical laws and online observations into machine learning [2], so that seemingly irrelevant, low-cost battery data can be used to identify battery parameters without ground-truth values as training data. Taking diffusivity as an example, we show that it can be obtained from an easily measured sequence of cell voltage over time. Our results show that the method accurately quantifies the diffusivities of both the positive and negative electrodes, even when they are complex nonlinear functions of lithium concentration, purely from cell voltage data and without measuring either diffusivity or concentration. Notably, it can accurately recover non-monotonic, many-to-one relations such as “w”-shaped functions. The method is also robust to measurement noise and capable of estimating multiple parameters simultaneously.

Finally, I will report a new approach, Self-directed Online Learning Optimization (SOLO) [3], for cell optimization, which integrates a deep neural network (DNN) with finite-element electrochemical calculations. The DNN learns and substitutes for the objective as a function of the design variables. A small set of training data is generated dynamically based on the DNN’s prediction of the optimum; the DNN adapts to the new data and improves its prediction in the region of interest, iterating until convergence. We prove that the optimum predicted by the DNN converges to the true global optimum through the iterations. The algorithm reduced computational time by up to five orders of magnitude compared with directly applying heuristic methods and outperformed all state-of-the-art algorithms tested in our experiments.

Broadly, we emphasize that integrating human knowledge into machine learning [4] can lead to significant advances in cell design.
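To make the pruner–sampler workflow of [1] concrete, the following is a minimal illustrative sketch in Python using Optuna, whose HyperbandPruner (built on asynchronous successive halving) and TPESampler stand in for the pruner and sampler described above. The cycling model `simulated_capacity` is a hypothetical toy stand-in for a real cycling experiment or simulation; this is not the framework released with [1].

```python
import math
import random

import optuna


def simulated_capacity(config, cycle):
    """Toy stand-in for one cycling step of a real cell (hypothetical)."""
    fade = 0.002 * config["charge_c_rate"] ** 1.5 + 0.0005 * config["rest_minutes"] ** 0.5
    bonus = 0.05 if config["electrolyte"] == "EC/EMC" else 0.0
    return (1.0 + bonus) * math.exp(-fade * cycle) + random.gauss(0.0, 0.002)


def objective(trial):
    # Categorical, continuous, and discrete cell parameters in one search space.
    config = {
        "electrolyte": trial.suggest_categorical("electrolyte", ["EC/DMC", "EC/EMC"]),
        "charge_c_rate": trial.suggest_float("charge_c_rate", 0.5, 3.0),
        "rest_minutes": trial.suggest_int("rest_minutes", 0, 30),
    }
    capacity = 1.0
    for cycle in range(1, 101):
        capacity = simulated_capacity(config, cycle)
        trial.report(capacity, step=cycle)   # report intermediate result after each cycle
        if trial.should_prune():             # pruner halts unpromising cells early
            raise optuna.TrialPruned()
    return capacity                          # retained capacity after 100 cycles


study = optuna.create_study(
    direction="maximize",
    sampler=optuna.samplers.TPESampler(seed=0),
    pruner=optuna.pruners.HyperbandPruner(min_resource=5, max_resource=100, reduction_factor=3),
)
study.optimize(objective, n_trials=50, n_jobs=4)  # asynchronous parallel "cells"
print(study.best_params, study.best_value)
```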
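The idea behind [2], embedding a learnable parameter inside a differentiable physics model and training only on observable voltage, can be illustrated with the toy sketch below. A 1D planar diffusion model with an assumed linear open-circuit-voltage map stands in for the full cell model, and a small network representing a concentration-dependent diffusivity is recovered purely from simulated “measured” voltage. All functions and constants here are illustrative assumptions, not the paper’s implementation.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

N, DX, DT, STEPS, FLUX = 21, 0.05, 5e-4, 400, 1.0  # dimensionless toy discretization


def simulate_voltage(d_of_c):
    """Differentiable 1D diffusion with an applied surface flux; a toy OCV gives 'voltage'."""
    c = torch.full((N,), 0.1)
    voltage = []
    for _ in range(STEPS):
        c_face = 0.5 * (c[1:] + c[:-1])
        flux_int = -d_of_c(c_face) * (c[1:] - c[:-1]) / DX                      # interior Fickian fluxes
        flux = torch.cat([torch.zeros(1), flux_int, torch.full((1,), -FLUX)])   # no-flux left, lithiation right
        c = c - DT / DX * (flux[1:] - flux[:-1])                                # explicit conservative update
        voltage.append(3.8 - 0.8 * c[-1])                                       # toy linear OCV of surface concentration
    return torch.stack(voltage)


def d_true(c):
    return 0.3 + 1.0 * (2.0 * c - 1.0) ** 2   # toy non-monotonic "ground-truth" D(c)


with torch.no_grad():
    v_measured = simulate_voltage(d_true)      # stands in for measured voltage data

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))


def d_learned(c):
    # Bound D to keep the explicit scheme stable; the shape of D(c) is what is learned.
    return 0.05 + 1.45 * torch.sigmoid(net(c.unsqueeze(-1)).squeeze(-1))


opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for step in range(300):
    opt.zero_grad()
    loss = torch.mean((simulate_voltage(d_learned) - v_measured) ** 2)  # voltage mismatch only
    loss.backward()
    opt.step()

# Compare learned and true D(c) over the concentration range actually visited.
c_grid = torch.linspace(0.1, 0.6, 6)
print(torch.stack([c_grid, d_true(c_grid), d_learned(c_grid).detach()]))
```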
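Finally, the SOLO loop of [3], a surrogate DNN repeatedly refit on a small, dynamically generated training set concentrated around its own predicted optimum, can be sketched as follows. A cheap analytic function stands in for the finite-element electrochemical objective and a scikit-learn MLP stands in for the DNN; the sampling rule and all constants are illustrative assumptions rather than the published algorithm.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor


def expensive_objective(x):
    """Toy analytic stand-in for a finite-element electrochemical simulation (hypothetical)."""
    return np.sin(3.0 * x[0]) * np.cos(2.0 * x[1]) + 0.1 * np.sum(x ** 2)


bounds = [(-2.0, 2.0), (-2.0, 2.0)]
lo, hi = np.array(bounds).T
rng = np.random.default_rng(0)

# Small initial design set, evaluated with the expensive model.
X = rng.uniform(lo, hi, size=(10, 2))
y = np.array([expensive_objective(x) for x in X])

for iteration in range(15):
    # 1) Fit the surrogate: objective as a function of the design variables.
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0).fit(X, y)

    # 2) Locate the surrogate's predicted optimum with a cheap multi-start local search.
    def predicted(x):
        return float(surrogate.predict(x.reshape(1, -1))[0])

    starts = rng.uniform(lo, hi, size=(8, 2))
    x_star = min((minimize(predicted, s, bounds=bounds).x for s in starts), key=predicted)

    # 3) Generate a few new training points near the predicted optimum and evaluate them.
    X_new = np.clip(x_star + 0.2 * rng.standard_normal((5, 2)), lo, hi)
    X_new = np.vstack([X_new, x_star])
    X = np.vstack([X, X_new])
    y = np.concatenate([y, [expensive_objective(x) for x in X_new]])

best = np.argmin(y)
print("best design:", X[best], "objective:", y[best])
```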
REFERENCES:
[1] C. Deng, A. Kim, and W. Lu, “A generic battery-cycling optimization framework with learned sampling and early stopping strategies,” Patterns, 3, 100531 (2022). https://doi.org/10.1016/j.patter.2022.100531
[2] B. Wu, B. Zhang, C. Deng, and W. Lu, “Physics-encoded deep learning in identifying battery parameters without direct knowledge of ground truth,” Applied Energy, 321, 119390 (2022). https://doi.org/10.1016/j.apenergy.2022.119390
[3] C. Deng, Y. Wang, C. Qin, Y. Fu, and W. Lu, “Self-directed online machine learning for topology optimization,” Nature Communications, 13, 388 (2022). https://doi.org/10.1038/s41467-021-27713-7
[4] C. Deng, X. Ji, C. Rainey, J. Zhang, and W. Lu, “Integrating machine learning with human knowledge,” iScience, 23, 101656 (2020). https://doi.org/10.1016/j.isci.2020.101656