Abstract

Automated sorting through chemometric analysis of plastic spectral data could be a key strategy for improving plastic waste management. Deep learning is a promising chemometric tool, but its further development through multi-modal deep learning has been limited by a lack of available data. This work introduces a new Multi-modal Plastic Spectral Database (MMPSD) consisting of Fourier Transform Infrared (FTIR), Raman and Laser-Induced Breakdown Spectroscopy (LIBS) data for each sample in the database. MMPSD serves as the basis for a novel cross-modality generative modeling technique, termed Spectral Conversion Autoencoders (SCAE), which generates synthetic data of one modality from data of another modality. SCAE is advantageous over traditional generative models such as Variational Autoencoders (VAE) because it can generate class-specific synthetic data without the need to train a separate model for each data class. MMPSD also facilitated the exploration of multi-modal deep learning, which improved classification accuracy from 0.933 for a uni-modal approach to 0.970. SCAE can further be combined with multi-modal methods to achieve an accuracy of 0.963 while still using a single sensor, enabling multi-modal augmentation from the FTIR sensors already used in industrial sorting and reducing hardware costs.
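
The sketch below illustrates the cross-modality idea behind SCAE as described above: an encoder-decoder network trained on paired spectra so that a spectrum of one modality (e.g. FTIR) is converted into a synthetic spectrum of another modality (e.g. Raman). The layer sizes, spectrum lengths, and training details are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal SCAE-style sketch in PyTorch (assumed dimensions and layers).
import torch
import torch.nn as nn

class SpectralConversionAutoencoder(nn.Module):
    def __init__(self, in_dim: int = 1800, out_dim: int = 1500, latent_dim: int = 64):
        super().__init__()
        # Encoder compresses the source-modality spectrum into a latent code.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(),
            nn.Linear(512, latent_dim), nn.ReLU(),
        )
        # Decoder reconstructs the paired target-modality spectrum.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Training step sketch: (FTIR, Raman) pairs would come from a paired database such as MMPSD.
model = SpectralConversionAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

ftir = torch.randn(32, 1800)   # placeholder batch of source-modality (FTIR) spectra
raman = torch.randn(32, 1500)  # paired target-modality (Raman) spectra of the same samples

pred = model(ftir)             # synthetic Raman spectra generated from FTIR input
loss = loss_fn(pred, raman)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the network is conditioned on a real spectrum of the same physical sample, the generated output inherits that sample's class, which is why no separate generative model per class is needed.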
