Abstract

Estimating pain levels is crucial for patients with serious illnesses, those recovering from brain surgery, and those receiving intensive care. This study proposes an automatic pain intensity estimator that infers pain and its intensity from the user's facial expressions. The faces in the database are first cropped using a 'Chehra' face detector, which performs well even in uncontrolled, in-the-wild environments with wide variations in lighting and pose. The proposed technique extracts useful and discriminative patterns from facial expressions using novel Statistical Frei-Chen Mask (SFCM)-based features and DenseNet-based features. A Radial Basis Function-based Extreme Learning Machine (RBF-ELM) is then applied to these features for pain recognition and pain intensity estimation, as it offers both fast and accurate prediction. Because availability and high-performance decision-making are essential for informing physicians and auxiliary IoT nodes (such as wearable sensors), all data is stored, updated, and protected in the cloud. In addition, cloud computing reduces the time complexity of the training phase of the machine learning algorithms where a complete cloud/edge architecture can be built, by allocating additional computational resources and memory. The proposed method is evaluated on facial expression images from the UNBC-McMaster Shoulder Pain Expression Archive 2D face dataset, with pain intensity measured at four levels. Compared with results reported in the literature, the proposed work achieves improved performance.
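The RBF-ELM classifier named above admits a compact closed-form training step: hidden-layer activations are RBF responses to a set of centers, and only the output weights are solved for via least squares. The sketch below is a minimal illustration under assumed choices (centers drawn at random from the training data, a fixed kernel width `gamma`, toy Gaussian-blob data standing in for the SFCM/DenseNet feature vectors); it is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_hidden(X, centers, gamma):
    # RBF hidden layer: Gaussian response to each hidden-node center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def elm_train(X, y, n_hidden=40, gamma=0.5):
    # Standard ELM practice: centers are fixed at random (here, sampled from
    # the training set); only the output weights have a closed-form solution.
    centers = X[rng.choice(len(X), n_hidden, replace=False)]
    H = rbf_hidden(X, centers, gamma)
    T = np.eye(y.max() + 1)[y]        # one-hot class targets
    beta = np.linalg.pinv(H) @ T      # least-squares output weights
    return centers, beta

def elm_predict(X, centers, beta, gamma=0.5):
    return (rbf_hidden(X, centers, gamma) @ beta).argmax(axis=1)

# Toy demo: two separable blobs standing in for two pain-intensity classes.
X = np.vstack([rng.normal(0.0, 0.5, (100, 8)),
               rng.normal(2.0, 0.5, (100, 8))])
y = np.array([0] * 100 + [1] * 100)
centers, beta = elm_train(X, y)
acc = (elm_predict(X, centers, beta) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The single pseudo-inverse solve is what gives ELM variants their fast training relative to iteratively trained networks, which is the speed property the abstract attributes to RBF-ELM.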
