Abstract

In this study, we investigated expressive facial reactions to changes in the visual environment and their automatic extraction from sensor data, with the goal of providing a comfortable level of illumination in personal living spaces. In a first experiment, we showed that expressive facial reactions occur when the illumination of the visual environment changes; we captured facial images and manually classified them as expressing or not expressing discomfort. In a second experiment, we showed that these expressive facial reactions can be extracted and identified by automatic image processing: we extracted facial features and trained a support vector machine to perform the classification.
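
As a rough illustration of the classification step described above, the sketch below trains a support vector machine on pre-extracted facial feature vectors labeled as expressing or not expressing discomfort. It assumes scikit-learn is available; the feature representation (flattened facial landmark coordinates), the RBF kernel, and the synthetic placeholder data are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of SVM-based classification of facial feature vectors.
# Assumption: features have already been extracted from facial images
# (here modeled as flattened landmark coordinates); labels come from
# the manual discomfort / no-discomfort annotation described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per facial image, each row a feature vector
# y: 1 = expresses discomfort, 0 = does not (placeholder data for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 68 * 2))   # e.g. 68 facial landmarks, (x, y) flattened
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Standardize features, then fit an SVM (RBF kernel is an assumed choice).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
```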
