Abstract

In exploring pattern recognition for electronic noses via deep neural networks, traditional networks face key challenges such as low training efficiency and neglect of the spatial-temporal attributes of gas sensor response sequences. In this study, an unmanned gas-sensing test system is used to generate a large dataset that ensures robust model training. The Gramian angular field-Markov transition field (GAF-MTF) is used to convert time sequences into images, which are then compressed with data augmentation using advanced image-processing tools. This method preserves the temporal and spatial features within the sequences, thus enhancing model performance. Furthermore, the proposed multi-task learning (MTL) framework performs classification and regression tasks simultaneously. The primary component of the MTL network, composed mainly of convolutional neural networks, emphasizes the spatial features of the sequences, while an integrated long short-term memory layer preserves temporal feature analysis of the input data, thereby enhancing predictive performance. Even when images are compressed to only 3.9% of the original data, substantial information is preserved: the model trained on such compressed images attains an accuracy of 95.31% for classification and an R² score of 0.9510 for regression. Our work reveals the remarkable potential of integrating temporal and spatial features in pattern recognition and demonstrates the promise of multi-task deep learning networks in electronic nose technology.
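As a rough illustration of the pipeline described above, the sketch below encodes each sensor response sequence as a two-channel GAF/MTF image and feeds it to a small CNN whose feature maps are read row by row by an LSTM before branching into classification and regression heads. The use of the pyts library, the two-channel stacking, and all layer sizes are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch (assumed libraries: pyts, PyTorch; all sizes illustrative).
import numpy as np
import torch
import torch.nn as nn
from pyts.image import GramianAngularField, MarkovTransitionField

def encode_sequences(X, image_size=32):
    """X: (n_samples, n_timestamps) gas-sensor response sequences.
    Returns a 2-channel image per sample: GAF and MTF stacked."""
    gaf = GramianAngularField(image_size=image_size)
    mtf = MarkovTransitionField(image_size=image_size)
    gaf_img = gaf.fit_transform(X)               # (n_samples, H, W)
    mtf_img = mtf.fit_transform(X)               # (n_samples, H, W)
    return np.stack([gaf_img, mtf_img], axis=1)  # (n_samples, 2, H, W)

class MTLNet(nn.Module):
    """Illustrative CNN-LSTM multi-task model: gas classification + concentration regression."""
    def __init__(self, n_classes, image_size=32):
        super().__init__()
        self.cnn = nn.Sequential(                        # spatial features
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        feat = image_size // 4                           # spatial size after two poolings
        self.lstm = nn.LSTM(32 * feat, 64, batch_first=True)  # temporal features
        self.cls_head = nn.Linear(64, n_classes)         # classification head
        self.reg_head = nn.Linear(64, 1)                 # regression head

    def forward(self, x):
        f = self.cnn(x)                                  # (B, 32, feat, feat)
        seq = f.permute(0, 2, 1, 3).flatten(2)           # feature-map rows as a sequence
        _, (h, _) = self.lstm(seq)
        h = h[-1]
        return self.cls_head(h), self.reg_head(h).squeeze(-1)
```

In such a setup, the two heads would typically be trained jointly with a weighted sum of a cross-entropy loss (classification) and a mean-squared-error loss (regression); the weighting scheme here is likewise an assumption rather than the paper's stated choice.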
