Abstract
This project presents a system that automatically detects emotional dichotomy and mixed emotional experience on a Linux-based platform. Facial expressions, head movements and facial gestures are captured from pictorial input in order to derive attributes such as distances, coordinates and the movement of tracked points. A web camera is used to extract spectral attributes. Features are computed using the Fisherface algorithm, emotions are detected with a cascade classifier, and feature-level fusion is applied to build a combined feature vector. Live actions of the user are recorded to capture emotions. Based on the detected emotion, the system plays songs and displays a list of books.
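A minimal sketch of the pipeline the abstract describes, assuming OpenCV's Haar cascade face detector and the contrib Fisherface recognizer; the model file name, emotion label set, and emotion-to-media mapping below are illustrative assumptions, not details from the paper:

```python
# Sketch: cascade classifier locates the face in a webcam frame,
# a Fisherface model classifies the emotion, and the result is mapped
# to songs/books. Requires opencv-contrib-python for cv2.face.
import cv2

EMOTIONS = ["happy", "sad", "angry", "neutral"]            # assumed label order
MEDIA = {"happy": ("upbeat_playlist", "comedy_books"),     # hypothetical mapping of
         "sad": ("calm_playlist", "uplifting_books")}      # emotion -> songs, books

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
recognizer = cv2.face.FisherFaceRecognizer_create()
recognizer.read("fisher_emotions.yml")                     # assumed pre-trained model

cap = cv2.VideoCapture(0)                                  # live webcam input
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (350, 350))
        label, confidence = recognizer.predict(face)       # Fisherface prediction
        emotion = EMOTIONS[label]
        songs, books = MEDIA.get(emotion, ("default_playlist", "general_books"))
        print(f"{emotion} (conf={confidence:.1f}) -> play {songs}, show {books}")
cap.release()
```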