Abstract
Estimating the remaining useful life (RUL) of proton exchange membrane fuel cells (PEMFCs) is key to predicting and improving their durability. This paper investigates six deep learning techniques, namely long short-term memory (LSTM) networks, gated recurrent units (GRUs), 1D convolutional neural network (CNN)-LSTMs, 1D-CNN-GRUs, 1D-CNN-Bidirectional-LSTMs, and 1D-CNN-Bidirectional-GRUs, for RUL and long-term degradation-trend prediction for a five-cell PEMFC stack operated with and without current ripples. The hyperparameters of all models are optimized for minimum root-mean-square error (RMSE) of the stack voltage using tree-structured Parzen estimators (TPE) with Hyperband pruning. The prediction results are benchmarked using the coefficient of determination (R²), RMSE, and mean absolute percentage error (MAPE) for stack-voltage predictions, and the relative error (RE) for RUL estimates. For models trained on 50% of the datasets, the comparative analysis shows that the best R², RMSE, MAPE, and RE of 0.9887, 1.258×10⁻³, 0.0285, and 0.00425 are achieved by the LSTM model for the stack without current ripples, while for the stack with current ripples, the best R², RMSE, MAPE, and RE of 0.99482, 1.77×10⁻³, 0.0444, and 0.0089 are achieved by the 1D-CNN-Bidirectional-GRU model. The LSTM model also shows excellent transfer-learning performance on the stack with current ripples, with the best R², RMSE, MAPE, and RE of 0.9891, 2.455×10⁻³, 0.0362, and 0.0044, respectively.
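For context on the optimization step mentioned above, the sketch below illustrates how TPE sampling with Hyperband pruning can be set up to minimize validation RMSE of a stack-voltage predictor using Optuna and Keras. The data arrays, search ranges, window length, and network depth are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (not the authors' code): tuning an LSTM voltage predictor with
# Optuna's TPE sampler and Hyperband pruner. The data arrays are hypothetical
# placeholders standing in for windowed PEMFC stack-voltage time series.
import numpy as np
import optuna
import tensorflow as tf

# Hypothetical sliding-window data: 20 time steps, 5 operating-condition features.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(800, 20, 5)), rng.normal(size=(800, 1))
X_val, y_val = rng.normal(size=(200, 20, 5)), rng.normal(size=(200, 1))

def objective(trial: optuna.Trial) -> float:
    # Search space is illustrative only; the paper's actual ranges are not given here.
    units = trial.suggest_int("lstm_units", 16, 128)
    lr = trial.suggest_float("learning_rate", 1e-4, 1e-2, log=True)

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=X_train.shape[1:]),
        tf.keras.layers.LSTM(units),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(lr), loss="mse")

    # Report intermediate validation RMSE so Hyperband can prune weak trials early.
    val_rmse = float("inf")
    for epoch in range(20):
        model.fit(X_train, y_train, epochs=1, batch_size=32, verbose=0)
        val_rmse = float(np.sqrt(model.evaluate(X_val, y_val, verbose=0)))
        trial.report(val_rmse, step=epoch)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return val_rmse

study = optuna.create_study(
    direction="minimize",
    sampler=optuna.samplers.TPESampler(seed=0),
    pruner=optuna.pruners.HyperbandPruner(),
)
study.optimize(objective, n_trials=25)
print(study.best_params, study.best_value)
```

The same study object could be reused for each of the six architectures by swapping the model-construction step inside the objective; the metrics reported in the abstract (R², RMSE, MAPE, RE) would then be computed on the held-out test portion with the best-found hyperparameters.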