Abstract

Assuring personal thermal comfort is a challenging and significant problem for achieving energy savings in buildings. Because age, gender, and skin temperature have all been shown to correlate with thermal comfort, we propose a new non-intrusive method of predicting personal thermal comfort by extracting these three parameters from thermal images. This study focuses on one key step of the method: recognizing age and gender from thermal images. We established a dataset of 3000 thermal images and 3000 visible-light images with a balanced distribution of age and gender, and conducted three pre-processing experiments covering thermal image processing, sampling strategy, and data augmentation. We then applied four convolutional neural networks (CNNs) to detect age and gender from thermal and visible-light images, respectively. Finally, we used class activation mapping (CAM) to understand how the CNNs recognize age and gender from thermal images. ResNet-50 achieved a gender classification accuracy of 95.3%, while EfficientNet provided a one-off age accuracy of 73.1%. These accuracy rates are higher than, or comparable to, those reported in the existing literature. Moreover, thermal images outperformed visible-light images: they offered similar classification accuracy, required less training time, and exposed skin-temperature information that visible-light images cannot provide to a CNN for learning. In addition, viewing distance and viewing angle did not influence classification accuracy, provided some crucial facial features were exposed to the thermal cameras. This study demonstrates the feasibility of using CNNs to detect age and gender from thermal images, laying a reliable foundation for the proposed automatic prediction of personal thermal comfort.
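The class activation mapping (CAM) technique mentioned in the abstract projects the final fully-connected weights onto the last convolutional feature maps to show which image regions drive a class prediction. A minimal sketch of that weighted-sum computation, using NumPy and randomly generated toy tensors (the shapes and the `class_activation_map` helper are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """Compute a CAM as the class-specific weighted sum of conv feature maps.

    feature_maps: (C, H, W) activations from the network's last conv layer
    fc_weights:   (num_classes, C) weights of the final linear layer
    class_idx:    index of the class to visualize (e.g. a gender class)
    """
    w = fc_weights[class_idx]                    # (C,) weights for this class
    cam = np.tensordot(w, feature_maps, axes=1)  # weighted sum -> (H, W)
    cam -= cam.min()                             # shift so minimum is 0
    if cam.max() > 0:
        cam /= cam.max()                         # scale to [0, 1] for display
    return cam

# Toy example: 4 feature channels on a 7x7 grid, 2 classes (hypothetical).
rng = np.random.default_rng(0)
fmaps = rng.random((4, 7, 7))
weights = rng.random((2, 4))
cam = class_activation_map(fmaps, weights, class_idx=0)
print(cam.shape)  # a (7, 7) heat map over the image grid
```

In practice the resulting heat map is upsampled to the input image size and overlaid on the thermal image to inspect which facial regions the CNN relied on.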
