<div class="section abstract"><div class="htmlview paragraph">This paper presents an experimental investigation of the impact of EGR dilution on the tradeoff between flame and end-gas autoignition heat release in a Spark-Assisted Compression Ignition (SACI) combustion engine. The mixture was maintained stoichiometric and fuel-to-charge equivalence ratio (<i>ϕ</i><sup>′</sup>) was controlled by varying the EGR dilution level at constant engine speed. Under all conditions investigated, end-gas autoignition timing was maintained constant by modulating the mixture temperature and spark timing. Experiments at constant intake pressure and constant spark timing showed that as <i>ϕ</i><sup>′</sup> is increased, lower mixture temperatures are required to match end-gas autoignition timing. Higher <i>ϕ</i><sup>′</sup> mixtures exhibited faster initial flame burn rates, which were attributed to the higher laminar flame speeds immediately after spark timing and their effect on the overall turbulent burning velocity. The increasing trends in peak heat release rate and peak autoignition rate as a function of <i>ϕ</i><sup>′</sup> were consistent for all intake pressures ranging from 80 kPa to 150 kPa. For a constant spark timing, the mass fraction burned at the onset of autoignition correlated well with <i>ϕ</i><sup>′</sup>, regardless of intake pressure. Reducing the unburned mass fraction at the onset of autoignition was effective at limiting the peak heat release rates. This behavior was attributed to the reduced energy content of the end-gas and also to the reduced end-gas burn rates. Possible explanations for the latter observation were the greater buffer gas effect due to a larger burned gas mass, and the higher reactivity stratification of the end-gas being closer to the walls.</div></div>