Abstract
Accurate classification of human emotions in designed spaces is essential for architects and engineers, who aim to maximize positive emotions by configuring architectural design features. Previous studies at the intersection of neuroscience and architecture have confirmed the impact of architectural design features on human emotions. Recent developments in biometric sensors have enabled researchers to identify emotions by measuring human physiological responses (e.g., using electroencephalography (EEG) to measure brain activity). However, a knowledge gap remains regarding accurate classification models for human emotions across design variants. This study proposed a convolutional neural network (CNN) based approach to classify human emotions. The approach considered two types of CNN architectures: a CNN ensemble and auto-encoders. The inputs to these CNNs were 2D images generated by projecting the EEG frequency band power onto a scalp map according to the electrode placements. This transformation from time-series EEG data to 2D frequency band power images retains the spatial, temporal, and frequency-domain features of participants' brain dynamics. Performance of the proposed approach was validated using multiple metrics, including precision, recall, F1 score, and area under the curve (AUC). Results showed that the auto-encoder-based approach achieved the best performance, with an AUC of 0.95.
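To make the EEG-to-image transformation concrete, the sketch below shows one way such 2D band-power images could be produced. It is a minimal illustration under stated assumptions, not the authors' pipeline: the sampling rate, electrode coordinates (ELECTRODE_POS), frequency bands (BANDS), and the helper eeg_to_images are all hypothetical, and the paper's montage and band definitions may differ.

```python
# Minimal sketch: project per-electrode EEG band power onto a 2D scalp grid.
# Assumptions (not from the paper): 128 Hz sampling rate, a 7-electrode toy
# montage, theta/alpha/beta bands, and cubic interpolation onto a 32x32 grid.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import griddata

FS = 128  # assumed sampling rate in Hz

# Hypothetical 2D scalp-projected (x, y) positions for a few electrodes.
ELECTRODE_POS = {
    "Fp1": (-0.3, 0.9), "Fp2": (0.3, 0.9),
    "C3": (-0.6, 0.0), "Cz": (0.0, 0.0), "C4": (0.6, 0.0),
    "O1": (-0.3, -0.9), "O2": (0.3, -0.9),
}

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands


def band_power(signal, fs, lo, hi):
    """Average power of one channel within [lo, hi) Hz via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()


def eeg_to_images(eeg, fs=FS, grid_size=32):
    """Map per-channel band power onto a 2D scalp grid.

    eeg: array of shape (n_channels, n_samples), channel order matching
    ELECTRODE_POS. Returns an array of shape (len(BANDS), grid_size, grid_size),
    i.e., one image channel per frequency band.
    """
    points = np.array(list(ELECTRODE_POS.values()))
    xs = np.linspace(-1, 1, grid_size)
    grid_x, grid_y = np.meshgrid(xs, xs)

    images = []
    for lo, hi in BANDS.values():
        powers = np.array([band_power(ch, fs, lo, hi) for ch in eeg])
        img = griddata(points, powers, (grid_x, grid_y),
                       method="cubic", fill_value=0.0)
        images.append(img)
    return np.stack(images)


if __name__ == "__main__":
    # Synthetic four-second example with random EEG-like data.
    rng = np.random.default_rng(0)
    fake_eeg = rng.standard_normal((len(ELECTRODE_POS), FS * 4))
    imgs = eeg_to_images(fake_eeg)
    print(imgs.shape)  # (3, 32, 32): stacked band-power images for a CNN input
```

Images of this kind, stacked across frequency bands, would then serve as the 2D inputs to the CNN ensemble and auto-encoder classifiers described in the abstract.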