Abstract

Machine learning technologies have gained significant popularity in rechargeable battery research in recent years and have been extensively adopted to construct data-driven solutions to multiple challenges in energy storage for embedded computing systems. An important application in this area is machine learning-based battery lifetime prediction, which formulates regression models to estimate the remaining lifetimes of batteries from measurement data collected during testing. Due to non-idealities in practical operation, these measurements are usually affected by various types of interference, which introduces noise into both the input variables and the regression labels. Existing works that focus solely on minimizing the regression error on the labels therefore cannot adequately adapt to practical scenarios with noisy variables. To address this issue, this study adopts total least squares (TLS) to construct a regression model that achieves superior accuracy by simultaneously optimizing the estimates of both the variables and the labels. Furthermore, because collecting battery cycling data is expensive, the number of labeled samples available for predictive modeling is often limited. This, in turn, can easily lead to overfitting, especially for TLS, which has a relatively larger set of problem unknowns to solve for. To tackle this difficulty, TLS is combined with stepwise feature selection in this work. Our numerical experiments on public datasets of commercial lithium-ion batteries demonstrate that the proposed method reduces the modeling error by up to 11.95% compared with classic baselines under noisy measurements.
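The abstract's core technique, total least squares, can be sketched in a few lines. Unlike ordinary least squares, the classical TLS solution is obtained from the singular value decomposition of the augmented data matrix `[X | y]`: the coefficient vector is read off the right singular vector associated with the smallest singular value, which simultaneously accounts for errors in both the inputs and the labels. The snippet below is a minimal illustration of that standard construction, not the authors' implementation; the function name `tls_fit` and the toy data are assumptions for demonstration.

```python
import numpy as np

def tls_fit(X, y):
    """Classical total least squares for X @ w ~= y with noise in X and y."""
    # Augmented matrix [X | y]; its smallest singular direction gives the fit.
    Z = np.column_stack([X, y])
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]               # right singular vector for the smallest singular value
    return -v[:-1] / v[-1]   # solve [X | y] @ [w; -1] ~= 0 for w

# Toy example (hypothetical data, not from the paper):
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = X @ np.array([2.0, 3.0])     # noiseless case: exact recovery expected
w = tls_fit(X, y)                # -> approximately [2.0, 3.0]
```

In the noiseless case above, the augmented matrix is rank-deficient and TLS recovers the true coefficients exactly; with noise on both `X` and `y`, TLS corrects for errors-in-variables where ordinary least squares would be biased.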
