Abstract

Recently, Artificial Intelligence (AI) methodologies such as Long Short-Term Memory (LSTM) have been widely considered promising tools for engine performance calibration, especially for predicting and optimizing engine emission performance, and the Transformer is also increasingly applied to sequence prediction. High-precision engine control and calibration require predicting emission sequences over long time steps. However, LSTM suffers from vanishing gradients when input and output sequences become too long, and the Transformer cannot reflect the dynamic features of historical emission information derived from cycle-by-cycle engine combustion events, which, owing to the inherent limitations of its encoder-decoder structure, leads to low accuracy and weak adaptability. In this paper, considering the highly nonlinear relation between the multi-dimensional engine operating parameters and the engine emission outputs, an Embedding-Graph-Neural-Network (EGNN) model was developed in which a self-attention mechanism drives the adaptive graph generation part of the GNN to capture the relationships between sequences, improve the prediction of long-time-step sequences, and reduce the number of parameters to simplify the network structure. A sensor embedding method was then adopted to make the model adapt to the data characteristics of different sensors, thereby reducing the impact of experimental hardware on prediction accuracy. The experimental results show that, under long-time-step forecasting, the prediction error of our model decreased by 31.04% on average compared with five baseline models, which demonstrates that the EGNN model can potentially be used in future engine calibration procedures.
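For illustration, the sketch below shows one way such a layer could be assembled in PyTorch: a learnable per-sensor embedding is concatenated with the node features, self-attention over the embedded nodes produces an adaptive adjacency matrix, and one step of message passing propagates information over the learned graph. The class name `AdaptiveGraphLayer`, the dimensions, and the propagation rule are illustrative assumptions, not the paper's released implementation.

```python
# Hypothetical sketch of an EGNN-style layer: per-sensor embeddings plus a
# self-attention-based adaptive adjacency matrix feeding a simple graph
# convolution. Names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveGraphLayer(nn.Module):
    def __init__(self, n_sensors: int, in_dim: int, embed_dim: int, out_dim: int):
        super().__init__()
        # Learnable embedding per sensor, so the model can adapt to each
        # sensor's data characteristics (assumed role of the sensor embedding).
        self.sensor_embed = nn.Embedding(n_sensors, embed_dim)
        self.q_proj = nn.Linear(in_dim + embed_dim, embed_dim)
        self.k_proj = nn.Linear(in_dim + embed_dim, embed_dim)
        self.out = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_sensors, in_dim) features per sensor/node.
        b, n, _ = x.shape
        ids = torch.arange(n, device=x.device)
        e = self.sensor_embed(ids).unsqueeze(0).expand(b, -1, -1)
        h = torch.cat([x, e], dim=-1)
        # Scaled self-attention scores serve as an adaptive adjacency matrix,
        # capturing pairwise relationships between sensor sequences.
        q, k = self.q_proj(h), self.k_proj(h)
        adj = F.softmax(q @ k.transpose(1, 2) / q.size(-1) ** 0.5, dim=-1)
        # One step of message passing over the learned graph.
        return F.relu(self.out(adj @ x))

# Usage: 8 sensors, 16-dim features per node, 32-dim output features.
layer = AdaptiveGraphLayer(n_sensors=8, in_dim=16, embed_dim=24, out_dim=32)
out = layer(torch.randn(4, 8, 16))   # -> shape (4, 8, 32)
```

Learning the adjacency matrix from attention scores, rather than fixing it a priori, is one plausible way a GNN could avoid a hand-specified sensor graph while keeping the parameter count modest.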
