Abstract

Pain is an unpleasant feeling that can reflect a patient's health condition. Because manual pain assessment is subjective, time-consuming, and requires continuous monitoring, automated pain intensity detection from facial expressions holds great potential for smart healthcare applications. Convolutional Neural Networks (CNNs) have recently been used to extract features from facial images and to map and model pain intensity, showing great promise in helping practitioners detect disease. However, limited research has been conducted on estimating pain intensity across multiple classes, and CNNs with simple learning schemes are limited in their ability to extract feature information from images. To develop a highly accurate pain intensity estimation system, this study proposes a Deep CNN (DCNN) model based on transfer learning: a pre-trained DCNN is adopted, its dense upper layers are replaced, and the model is fine-tuned on painful facial expression images. We conducted experiments on the UNBC-McMaster shoulder pain archive database to estimate pain intensity at seven threshold levels from a given facial expression image. The experiments show that our method achieves a promising improvement in accuracy and performance for pain intensity estimation and outperforms state-of-the-art models.
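The transfer-learning scheme summarized above (freeze a pre-trained backbone, replace the dense upper layers with a new head for the seven pain-intensity classes, and fine-tune on facial images) can be sketched in miniature as follows. This is an illustrative, framework-free sketch using NumPy only: the "pre-trained backbone" is stood in for by a fixed random linear layer, and the data, shapes, and hyperparameters are synthetic assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 7          # seven-level pain intensity thresholds
FEAT_IN, FEAT_OUT = 64, 32

# Frozen "pre-trained" backbone, stood in for by a fixed random
# linear layer + ReLU (its weights are never updated below).
W_base = rng.standard_normal((FEAT_IN, FEAT_OUT)) * 0.1

def extract_features(x):
    # Frozen feature extractor: linear projection followed by ReLU.
    return np.maximum(x @ W_base, 0.0)

# New trainable dense head, replacing the original upper layers.
W_head = np.zeros((FEAT_OUT, N_CLASSES))
b_head = np.zeros(N_CLASSES)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Synthetic stand-in for flattened facial-expression inputs and labels.
X = rng.standard_normal((350, FEAT_IN))
y = rng.integers(0, N_CLASSES, size=350)

# Fine-tune only the head with gradient descent on cross-entropy;
# backbone outputs are computed once since the backbone is frozen.
F = extract_features(X)
onehot = np.eye(N_CLASSES)[y]
lr = 0.5
for _ in range(200):
    probs = softmax(F @ W_head + b_head)
    grad = (probs - onehot) / len(X)       # d(loss)/d(logits)
    W_head -= lr * F.T @ grad
    b_head -= lr * grad.sum(axis=0)

preds = softmax(F @ W_head + b_head).argmax(axis=1)
train_acc = (preds == y).mean()
print(f"head-only training accuracy on synthetic data: {train_acc:.2f}")
```

In practice the frozen backbone would be a real pre-trained DCNN (e.g. an ImageNet model) and only the replaced dense layers, and possibly the top convolutional blocks, would be fine-tuned on the pain dataset; the sketch keeps only the freeze-backbone / train-new-head structure of that recipe.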
