Abstract
Pain assessment in children continues to challenge clinicians and researchers, as subjective experiences of pain must be inferred from observable behaviors, both involuntary and deliberate. The approach presented here supplements subjective self-report-based methods by fusing electrodermal activity (EDA) recordings with video facial expressions to develop an objective pain assessment metric. Such an approach is especially important for assessing pain in children who cannot provide accurate self-reports of pain and therefore require nonverbal pain assessment. We demonstrate the performance of our approach using data recorded from children in post-operative recovery following laparoscopic appendectomy. We examined the usefulness of EDA and video facial expression data, separately and in combination, as predictors of children's self-reports of pain during post-surgical recovery. Findings indicate that EDA and facial expression data independently provide above-chance sensitivities and specificities, but their fusion for classifying clinically significant vs. clinically nonsignificant pain achieved substantial improvement, yielding 90.91% accuracy, with 100% sensitivity and 81.82% specificity. The multimodal measures capitalize upon different features of the complex pain response. Thus, this paper presents both evidence for the utility of a weighted maximum likelihood algorithm as a novel feature selection method for EDA and video facial expression data and an accurate, objective automated classification algorithm capable of discriminating clinically significant pain from clinically nonsignificant pain in children.