Abstract

Remote Health Monitoring (RHM) is a promising approach to continuous, higher-quality healthcare, especially in rural areas where basic medical facilities are scarce or out of reach. RHM typically focuses on monitoring physiological signals (e.g., ECG, EEG) or activities, while the overall emotional state of remotely monitored patients is often neglected because of the complexity of assessing it. It is therefore crucial to develop RHM-suitable technologies for explainable and reliable emotional state recognition. In this work, a free-position model for detecting emotions and facial action units (FAUs) is proposed. The method has been validated on the public CK+ dataset and shows very promising results in terms of explainability, with performance competitive with the state of the art (an average accuracy of 93.4% over all AUs). In addition, its lightweight architecture and low parameter count, compared with other facial expression recognition (FER) models, make it suitable for resource-constrained embedded systems and open the way for wide adoption in remote health monitoring applications.
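The abstract does not disclose the model's architecture, so as a purely illustrative sketch, a lightweight multi-label AU detector of the kind described might look like the following (PyTorch; the CNN layout, the 48x48 grayscale input, and the count of 17 AUs are assumptions for illustration, not the authors' design):

```python
# Hypothetical sketch of a lightweight multi-label FAU detector.
# None of the specifics below come from the paper; they only illustrate
# how a low-parameter model with one output per action unit can be built.
import torch
import torch.nn as nn

class TinyFAUNet(nn.Module):
    """Illustrative multi-label AU detector, well under 100K parameters."""
    def __init__(self, num_aus: int = 17):  # AU count assumed for illustration
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 48x48 -> 24x24
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 24x24 -> 12x12
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 12x12 -> 6x6
            nn.AdaptiveAvgPool2d(1),                                # global average pool
        )
        self.head = nn.Linear(64, num_aus)  # one logit per action unit

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)  # (batch, 64)
        return self.head(z)              # raw logits; sigmoid gives per-AU scores

model = TinyFAUNet()
logits = model(torch.randn(1, 1, 48, 48))      # one 48x48 grayscale face crop
probs = torch.sigmoid(logits)                  # AUs are not mutually exclusive
active_aus = (probs > 0.5).nonzero(as_tuple=True)[1]  # AUs predicted present
print(sum(p.numel() for p in model.parameters()), "parameters")  # roughly 24K
```

Because several AUs can be active at once, a per-AU sigmoid with binary cross-entropy is the usual formulation for this task, rather than a softmax over mutually exclusive classes; a separate softmax head over the same features could map to discrete emotion labels.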
