Abstract

Inadequate treatment of pain is frequent in modern society, with major medical, ethical, and financial implications. In many healthcare environments, pain is quantified predominantly through subjective measures, such as patients' self-reports or health care providers' personal judgment. Recently, automatic diagnostic tools have been developed to detect and quantify pain more objectively from facial expressions. However, it is still unclear whether these approaches can distinguish pain from other aversive (but painless) states. In the present study, we analyzed facial responses from a database of video-recorded reactions evoked by comparably unpleasant painful and disgusting stimuli. We modeled this information as a function of subjective unpleasantness, as well as of the specific state evoked by the stimuli (pain vs. disgust). Results show that a machine learning algorithm could predict subjective pain unpleasantness from facial information, but it also mistakenly detected pain in response to unpleasant disgust, especially in models relying to a great extent on the brow lowerer. Importantly, pain and disgust could be disentangled by an ad hoc algorithm relying on combined information from the eyes and the mouth. Overall, the facial expression of pain contains both state-specific information and unpleasantness-related information shared with disgust. Automatic diagnostic tools should be designed to account for this confounding effect.
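The abstract describes two analyses: regressing subjective unpleasantness on facial information, and classifying pain versus disgust from eye- and mouth-related cues. The following Python sketch illustrates that general logic on synthetic data; it is not the authors' pipeline, and all feature names (facial action units such as AU4, the brow lowerer), the simulated data, and the model choices are assumptions made here purely for illustration.

```python
# Hedged illustration of the two analyses sketched in the abstract, on
# synthetic data. Feature names, targets, and models are assumptions;
# this is NOT the study's actual pipeline.
import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical per-trial facial action unit (AU) intensities on a 0-5 scale.
# AU4 = brow lowerer; AU6/AU7 relate to the eye region; AU25/AU26 to the mouth.
au_names = ["AU4_brow_lowerer", "AU6_cheek_raiser", "AU7_lid_tightener",
            "AU25_lips_part", "AU26_jaw_drop"]
X = rng.uniform(0, 5, size=(n, len(au_names)))

# Simulated targets: unpleasantness loads mostly on the brow lowerer
# (shared between pain and disgust), while the pain/disgust state depends
# on a combination of eye and mouth activity.
unpleasantness = 0.8 * X[:, 0] + 0.2 * X[:, 2] + rng.normal(0, 0.5, n)
is_pain = (0.6 * X[:, 2] + 0.6 * X[:, 4] - 0.5 * X[:, 1]
           + rng.normal(0, 0.5, n)) > 2.0

# Analysis 1: predict subjective unpleasantness from AU intensities.
reg = Ridge()
r2 = cross_val_score(reg, X, unpleasantness, cv=5, scoring="r2")
print("unpleasantness R^2:", r2.mean().round(2))

# Inspect the weights: a model leaning heavily on AU4 tracks unpleasantness
# well but, because AU4 is shared with disgust, cannot separate the two
# states -- mirroring the confound described in the abstract.
reg.fit(X, unpleasantness)
for name, coef in zip(au_names, reg.coef_):
    print(f"{name}: {coef:.2f}")

# Analysis 2: classify pain vs. disgust from eye- and mouth-related AUs only.
eyes_mouth = X[:, [1, 2, 3, 4]]
clf = LogisticRegression()
acc = cross_val_score(clf, eyes_mouth, is_pain, cv=5)
print("pain vs. disgust accuracy:", acc.mean().round(2))
```

Under these assumptions, the unpleasantness regression succeeds while relying on the brow lowerer, and the state classification succeeds only when it draws on the combined eye and mouth features, which is the dissociation the abstract reports.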
