Abstract

This paper presents a facial expression recognition approach for recognizing affective states. Feature extraction is a vital step in facial expression recognition. In this work, a novel facial feature extraction method based on the Intersecting Cortical Model (ICM) is proposed. The ICM network, a simplified version of the Pulse-Coupled Neural Network (PCNN), has great potential for pixel grouping. In the proposed method, the normalized face image is segmented into two regions, the mouth and the eyes, using fuzzy c-means clustering (FCM). The segmented face images are fed into an ICM network run for 300 iterations, and the pulse image produced by the ICM network is taken as the face code; a support vector machine (SVM) is then trained to discriminate among the different expressions and thus distinguish the affective states. To evaluate the performance of the proposed algorithm, a face image dataset is constructed and the algorithm is used to classify seven basic expressions, including happiness, sadness, fear, anger, surprise, and hate. The experimental results confirm that the ICM network has great potential for facial feature extraction and that the proposed method for human affect recognition is promising. Fast feature extraction is the main advantage of this method, which makes it useful for real-world applications.
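The sketch below illustrates the kind of ICM-based feature extraction the abstract describes: an internal state and a dynamic threshold are updated over a fixed number of iterations, the binary pulses are accumulated into a "face code", and that code is fed to an SVM. The parameter values (f, g, h), the 3x3 linking kernel, and the pulse-count form of the face code are illustrative assumptions, not the authors' exact settings.

```python
# Minimal sketch of ICM feature extraction for expression recognition.
# Parameter values and the accumulated pulse-count face code are assumptions.
import numpy as np
from scipy.ndimage import convolve
from sklearn.svm import SVC

def icm_face_code(image, iterations=300, f=0.9, g=0.8, h=20.0):
    """Run an Intersecting Cortical Model on a normalized face image
    (2-D array scaled to [0, 1]) and return the accumulated pulse image."""
    S = image.astype(np.float64)   # external stimulus (pixel intensities)
    F = np.zeros_like(S)           # internal neuron state
    Theta = np.ones_like(S)        # dynamic threshold
    Y = np.zeros_like(S)           # binary pulse output
    code = np.zeros_like(S)        # accumulated pulses used as the face code
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])   # assumed local linking weights
    for _ in range(iterations):
        F = f * F + S + convolve(Y, kernel, mode="constant")  # state update
        Y = (F > Theta).astype(np.float64)                    # fire if above threshold
        Theta = g * Theta + h * Y                             # raise threshold where fired
        code += Y
    return code.ravel()

# Hypothetical usage: X_train holds segmented eye/mouth region images,
# y_train holds expression labels.
# clf = SVC(kernel="rbf")
# clf.fit([icm_face_code(img) for img in X_train], y_train)
```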
