Abstract
In this paper, we developed and integrated an AI-edge emotion recognition platform using multiple wearable physiological signal sensors: electroencephalogram (EEG), electrocardiogram (ECG), and photoplethysmogram (PPG) sensors. The emotion recognition platform combines two machine learning approaches, each with its own input and preprocessing stage: an EEG-based emotion recognition system and an ECG/PPG-based system. The EEG-based system is a convolutional neural network (CNN) that classifies three emotions: happiness, anger, and sadness. The CNN inputs are extracted from the EEG signals using the short-time Fourier transform (STFT), and the average accuracy for subject-independent classification reached 76.94%. The ECG/PPG-based system uses a similar CNN with an extracted feature vector as input. The subject-dependent ECG/PPG classification system reached an average accuracy of 76.8%. The proposed system was integrated on RISC-V processor and FPGA platforms to implement real-time monitoring and classification at the edge. A 3-to-1 Bluetooth piconet was deployed to transmit all physiological signals to a single platform access point and to exploit low-power wireless technology.
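To make the EEG pipeline described above concrete, the sketch below shows one plausible way to convert a raw EEG window into an STFT spectrogram and classify it with a small CNN into the three emotion classes. It is a minimal illustration only: the sampling rate, window length, library choices (SciPy and PyTorch), and layer sizes are assumptions, not the authors' exact architecture or parameters.

```python
# Minimal sketch (assumed, not the paper's exact pipeline): raw EEG window
# -> STFT magnitude spectrogram -> small 2-D CNN -> 3 emotion classes.
import numpy as np
from scipy.signal import stft
import torch
import torch.nn as nn

FS = 128          # assumed EEG sampling rate (Hz); not stated in the abstract
N_CLASSES = 3     # happiness, anger, sadness

def eeg_to_spectrogram(eeg_window: np.ndarray) -> torch.Tensor:
    """Compute the STFT magnitude of one EEG window -> (1, freq, time) tensor."""
    _, _, zxx = stft(eeg_window, fs=FS, nperseg=FS)   # 1-second STFT segments
    spec = np.abs(zxx).astype(np.float32)
    return torch.from_numpy(spec).unsqueeze(0)        # add a channel dimension

class EmotionCNN(nn.Module):
    """Small CNN over the spectrogram; layer sizes are illustrative only."""
    def __init__(self, n_classes: int = N_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Sequential(nn.Flatten(), nn.Linear(32 * 4 * 4, n_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: classify a 4-second synthetic EEG window (batch dimension added).
window = np.random.randn(4 * FS)
logits = EmotionCNN()(eeg_to_spectrogram(window).unsqueeze(0))
print("predicted class:", int(logits.argmax(dim=1)))
```

The ECG/PPG branch in the paper would follow the same pattern, replacing the spectrogram with the extracted feature vector as CNN input.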