Abstract

Communicating emotions to individuals who are blind through tactile feedback is an active area of research. Most existing work, however, is static in nature: facial expressions of emotion are conveyed through a fixed set of facial features that may be meaningful only to those who previously had sight. To individuals who are congenitally blind, these fixed representations are abstract, and little research examines how this population can interpret such signs and relate them to their own nonvisual experience. Our goal is to develop a complete system that integrates feature extraction with haptic recognition. Because emotion detection through image and video analysis often fails, we emphasize active exploration of one's own facial expressions, so that the movement of facial features becomes meaningful and users can grow proficient at interpreting the expressions associated with different emotions. We propose a dynamic haptic environment in which a blind individual can perceive a reflection of his or her own facial movements, and thereby understand and explore different facial expressions on the basis of the movement of his or her own facial features.
