Abstract

This study aimed to assess whether an artificial intelligence model based on facial expressions can accurately predict significant postoperative pain. A total of 155 facial expressions from patients who underwent gastric cancer surgery were analyzed to extract facial action units (AUs), gaze, landmarks, and head positions. These features were used to construct various machine learning (ML) models designed to distinguish significant postoperative pain (NRS ≥ 7) from less significant pain (NRS < 7). Significant AUs predictive of NRS ≥ 7 were identified and compared with AUs known to be associated with pain in awake patients. Areas under the receiver operating characteristic curve (AUROCs) of the ML models were calculated and compared using DeLong's test. AU17 (chin raising) and AU20 (lip stretching) were found to be associated with NRS ≥ 7 (both P ≤ 0.004). AUs known to be associated with pain in awake patients did not show an association with pain in postoperative patients. An ML model based on AU17 and AU20 demonstrated an AUROC of 0.62 for NRS ≥ 7, which was inferior to a model based on all AUs (AUROC = 0.81, P = 0.006). Among facial features, head position and facial landmarks proved to be better predictors of NRS ≥ 7 (AUROC, 0.85–0.96) than AUs. A merged ML model that utilized gaze and eye landmarks, as well as head position and facial landmarks, exhibited the best performance (AUROC, 0.90) in predicting significant postoperative pain. ML models using facial expressions can accurately predict the presence of significant postoperative pain and have the potential to screen patients in need of rescue analgesia. This study was registered at ClinicalTrials.gov (NCT05477303; date: June 17, 2022).
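The modeling approach described above (facial features → binary classifier for NRS ≥ 7 → AUROC evaluation) can be sketched as follows. This is a minimal illustration only: it uses synthetic stand-ins for the AU intensities rather than the study's actual dataset or model choice, and the feature names are hypothetical.

```python
# Minimal sketch of a pain-classification pipeline: features such as AU
# intensities are used to predict significant pain (NRS >= 7), and the
# model is evaluated by AUROC. Data here are synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 155  # matches the number of facial expressions analyzed in the study

# Synthetic stand-ins for two facial features (e.g. AU17, AU20 intensities)
X = rng.normal(size=(n, 2))
# Binary label: 1 = significant pain (NRS >= 7), 0 = less significant pain
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)
clf = LogisticRegression().fit(X_train, y_train)
auroc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"AUROC: {auroc:.2f}")
```

In practice the study compared multiple feature sets (AUs, gaze, landmarks, head position) and tested AUROC differences with DeLong's test, which is not part of scikit-learn and would require a separate implementation.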
