Abstract

This work presents the design and analysis of an automatic adaptive user interface (AUI) that uses a novel solution for recognizing the emotional state of a user through both facial expressions and body posture captured by an RGB-D sensor. Six basic emotions are recognized through facial expressions, in addition to the physiological state recognized through body posture. The facial expressions and body posture are acquired in real time from a Kinect sensor. A scoring system improves recognition by minimizing the confusion between the different emotions, and the implemented solution achieves an accuracy rate above 90%. The recognized emotion is then used to drive the automatic AUI, in which help is provided to the user automatically and speech commands can be used to modify the user interface (UI). A comprehensive user study compares the usability of the automatic AUI with a manual system. Results show that although the automatic AUI is quantitatively slower, and thus less efficient, it achieves effectiveness and error safety similar to the manual system. In addition, the automatic AUI yields a significantly more positive user experience than the manual system.
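The scoring system mentioned above can be illustrated with a minimal sketch. Assuming each modality (facial expression and body posture) produces per-emotion confidence scores, a weighted combination can reduce confusion between emotions; the function name, emotion set, and weights below are illustrative assumptions, not the paper's exact method.

```python
# Hypothetical score-based fusion of two recognition modalities.
# Weights and emotion labels are illustrative assumptions.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_scores(face_scores, posture_scores, face_weight=0.7):
    """Combine per-emotion scores from both modalities and return the
    highest-scoring emotion with its combined score."""
    combined = {
        e: face_weight * face_scores.get(e, 0.0)
           + (1.0 - face_weight) * posture_scores.get(e, 0.0)
        for e in EMOTIONS
    }
    best = max(combined, key=combined.get)
    return best, combined[best]

# Example: facial cues strongly favour happiness, posture is ambiguous.
face = {"happiness": 0.8, "surprise": 0.3}
posture = {"happiness": 0.4, "sadness": 0.2}
emotion, score = fuse_scores(face, posture)
```

Weighting the facial channel more heavily reflects only the assumption that facial expressions are the primary cue for the six basic emotions, with posture acting as a tie-breaker; the actual weighting in the paper's system may differ.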
