Abstract

With the recent surge of smart wearable devices, it is possible to obtain physiological and behavioral data from human beings in a more convenient and non-invasive manner. Based on such data, researchers have developed a variety of systems and applications to recognize and understand human behaviors, covering both physical activities (e.g., gestures) and mental states (e.g., emotions). In particular, it has been shown that different emotions cause different changes in physiological parameters. However, other factors, such as physical activities, may also affect one's physiological parameters. To recognize emotions accurately, we therefore need to explore not only physiological data but also behavioral data. To this end, we propose an adaptive emotion recognition system built on a sensor-enriched wearable smart watch. First, an activity identification method is developed to distinguish different activity scenes (e.g., sitting, walking, and running) using the accelerometer. Based on the identified activity scene, an adaptive emotion recognition method is proposed that leverages multi-mode sensory data, including blood volume pulse, electrodermal activity, and skin temperature. Specifically, we extract fine-grained features to characterize different emotions. Finally, the adaptive user emotion recognition model is constructed and verified by experiments. An accuracy of 74.3% across 30 participants demonstrates that the proposed system can recognize human emotions effectively.
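The two-stage pipeline described above can be sketched as follows. This is a minimal illustration only: the function names, thresholds, and feature choices are assumptions for clarity, not the paper's actual implementation, which uses fine-grained features and a trained model per activity scene.

```python
import statistics

def accel_features(samples):
    """Magnitude-based features over a window of (x, y, z) accelerometer readings."""
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in samples]
    return {"mean": statistics.mean(mags), "std": statistics.pstdev(mags)}

def classify_activity(feats):
    # Illustrative thresholds on movement variability; the abstract does not
    # specify the actual classifier or its features.
    if feats["std"] < 0.1:
        return "sitting"
    if feats["std"] < 1.0:
        return "walking"
    return "running"

def recognize_emotion(activity, physio, models):
    # models maps each activity scene to an emotion classifier trained for
    # that scene, so a physiological baseline shifted by movement (e.g.,
    # elevated pulse while running) is interpreted by the scene-specific
    # model rather than mistaken for an emotional response.
    # physio holds features from blood volume pulse, electrodermal
    # activity, and skin temperature.
    return models[activity](physio)
```

For example, a motionless window yields near-zero magnitude variance and is labeled "sitting", after which the sitting-scene emotion model is applied to the physiological features.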

