Abstract
The aim of the present work is to develop an automated algorithm for detecting, as objectively as possible, the flame front evolution of lean/ultra-lean mixtures ignited by low temperature plasma-based ignition systems. The low luminosity characterizing these conditions makes both kernel formation and combustion development difficult to detect accurately. Therefore, increasingly capable tools are required to estimate the ability of an igniter to efficiently ignite the mixture. The present work proposes a new image analysis technique, based on a dual-exposure fusion algorithm and on Convolutional Neural Networks (CNNs), to process low-brightness images captured via a high-speed camera on an optical engine. The performance of the proposed algorithm (PA) is compared to that of a base reference (BR) algorithm used by the same research group for imaging analysis. The comparison shows that PA quantifies the flame radius of consecutive combustion cycles with lower dispersion than BR and correctly detects some events considered misfires or anomalies by BR. Moreover, the proposed method detects kernel formation earlier than BR, thus allowing a more detailed analysis of igniter performance. A quantitative metric analysis is also carried out to confirm these results. PA therefore proves more suitable for analyzing ultra-lean combustion, which is heavily investigated to meet increasingly stringent legislation on internal combustion engines. Finally, the proposed algorithm makes it possible to estimate the flame front evolution automatically, regardless of the user's interpretation of the phenomenon.
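For readers who want a concrete picture of the processing chain described above, the sketch below illustrates one plausible Python implementation: two differently exposed frames are fused (here with OpenCV's Mertens exposure fusion), a small CNN produces a per-pixel flame probability map, and an equivalent flame radius is derived from the segmented area. The tiny network, the 0.1 mm/pixel scale, and the synthetic input frames are illustrative assumptions, not the configuration used in the paper.

```python
# Hedged sketch of a dual-exposure fusion + CNN segmentation pipeline for flame imaging.
import cv2
import numpy as np
import torch
import torch.nn as nn

def fuse_exposures(short_exp: np.ndarray, long_exp: np.ndarray) -> np.ndarray:
    """Blend two differently exposed 8-bit frames into one well-exposed image in [0, 1]."""
    fused = cv2.createMergeMertens().process([short_exp, long_exp])
    return np.clip(fused, 0.0, 1.0)

class TinyFlameSegmenter(nn.Module):
    """Placeholder fully convolutional network producing a per-pixel flame probability."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 1, 1),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x))

def equivalent_flame_radius_mm(mask: np.ndarray, mm_per_px: float) -> float:
    """Radius of the circle whose area equals the segmented flame area."""
    return float(np.sqrt(mask.sum() / np.pi) * mm_per_px)

if __name__ == "__main__":
    # Synthetic stand-ins for a short- and a long-exposure frame of the same combustion event.
    rng = np.random.default_rng(0)
    short_exp = rng.integers(0, 60, (256, 256, 3), dtype=np.uint8)
    long_exp = rng.integers(0, 255, (256, 256, 3), dtype=np.uint8)

    fused = fuse_exposures(short_exp, long_exp)                        # float32, 3 channels
    gray = cv2.cvtColor(fused.astype(np.float32), cv2.COLOR_BGR2GRAY)  # single channel for the CNN

    model = TinyFlameSegmenter().eval()
    with torch.no_grad():
        prob = model(torch.from_numpy(gray)[None, None])               # shape (1, 1, H, W)
    mask = prob[0, 0].numpy() > 0.5

    print(f"equivalent flame radius ~ {equivalent_flame_radius_mm(mask, mm_per_px=0.1):.2f} mm")
```

In practice a trained segmentation network would replace the placeholder model; the equivalent-radius step is a standard way to condense a binary flame mask into a single scalar that can be tracked across consecutive frames.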
Highlights
Over the last few decades, the development and application of both experimental and computational research have enabled in-depth analysis of fundamental physical phenomena occurring in spark-ignition (SI) internal combustion engines (ICEs) [1,2]. In the ICE experimental research field, the single-cylinder optical access engine is a well-known and widely used diagnostic technique for investigating the temporal evolution of the flame front [3,4].
The present work proposes a new image analysis technique, based on a dual-exposure fusion algorithm and on Convolutional Neural Networks (CNNs), to process low-brightness images captured via high-speed camera on an optical engine
The proposed method is preliminarily validated on a specific combustion event at λ = 1.4, by comparing the proposed algorithm (PA) output with binarized images obtained via human perception and used as Target (a sketch of such a mask-level comparison follows these highlights)
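Since the preliminary validation compares PA output against human-binarized Target frames, a mask-level agreement check is the natural quantitative tool. The snippet below gives one hedged example using Intersection-over-Union and the Dice coefficient; the specific metrics and data behind the paper's quantitative analysis are not detailed here, so these choices and the toy masks are assumptions.

```python
# Hedged example of comparing a predicted flame mask with a human-binarized Target mask.
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union between two boolean flame masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    return float(np.logical_and(pred, target).sum() / union) if union else 1.0

def dice(pred: np.ndarray, target: np.ndarray) -> float:
    """Dice coefficient 2|A∩B| / (|A| + |B|) between two boolean flame masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    total = pred.sum() + target.sum()
    return float(2 * np.logical_and(pred, target).sum() / total) if total else 1.0

if __name__ == "__main__":
    target = np.zeros((64, 64), dtype=bool)
    target[20:44, 20:44] = True        # hand-binarized flame region (toy stand-in)
    pred = np.roll(target, 3, axis=1)  # slightly shifted algorithm output (toy stand-in)
    print(f"IoU = {iou(pred, target):.3f}  Dice = {dice(pred, target):.3f}")
```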
Summary
In the ICE experimental research field, the single-cylinder optical access engine is a well-known and widely used diagnostic technique for investigating the temporal evolution of the flame front [3,4]. Low temperature plasma-based ignition systems have been widely studied in optical engines; these systems represent an alternative solution to the traditional spark for future high-efficiency SI engines [5,6,7,8,9,10,11]. Marko et al. [13] evaluated the projected flame area on a natural gas fueled engine and found improvements in EGR tolerance using corona instead of the conventional spark. The research group of the Department of Engineering (University of Perugia) found an important extension of the lean stable limit, compared to the traditional spark, at different engine operating conditions [14,15] and using different fuels. Detection of the first moment of kernel formation can be crucial to characterize the capability of an igniter to initiate robust combustions, especially under lean/ultra-lean mixture conditions [16]. The low luminosity characterizing these extreme conditions makes it difficult to recognize the combustion evolution (Figure 1) and, especially, the early flame development. For that reason, the requirements for a more powerful tool capable of recognizing the flame front led our research group to explore new ways to accomplish this target.