BACKGROUND: Facial neuropathy at the peripheral level (unilateral muscular weakness of the entire half of the face) is a common neurological disorder. Assessment of the facial nerve dysfunction grade is necessary to track the dynamics of treatment and to monitor the effectiveness of rehabilitation. For this purpose, worldwide clinical practice uses grading systems, the most popular of which are the House-Brackmann, Yanagihara, and Nottingham scales. Such methods are non-universal and based on visual diagnosis, which relies solely on the subjective experience of the physician. Consequently, objective measurements and automation are needed to track the dynamics of recovery. Image processing and computer vision techniques have made this task feasible.
 AIM: To develop a method for automated assessment of the facial nerve dysfunction grade by biometric facial analysis in order to monitor patients' recovery dynamics.
 METHODS: As part of the collaboration with the Herzen Moscow Research Institute of Oncology, a database of target group patients with grades IV (4 people), V (4 people), and VI (11 people) of facial nerve dysfunction according to the House-Brackmann scale was compiled. A control group consisted of 20 students from the Bauman Moscow State Technical University. During registration, subjects were asked to perform the following series of mimic tests: raising the eyebrows, closing the eyes, smiling, smiling with effort, inflating the cheeks, pouting the lips, and articulating with effort. Control points of the eyebrow, eye, and mouth areas were used to assess the degree of facial asymmetry. The two-dimensional MultiPIE model, implemented in the dlib library and containing 68 control points, was used as the facial model. Python code was written that calculates asymmetry coefficients based on changes in the coordinates of the control points as the patient performs the mimic tests.
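The abstract does not give the exact formula for the asymmetry coefficients, so the sketch below is only an illustrative assumption: asymmetry is measured as the relative imbalance between the displacements of mirrored left/right landmark pairs between a rest frame and a mimic-test frame. The landmark indices follow dlib's 68-point (Multi-PIE markup) scheme; the specific mouth pairs chosen here are an assumption, not the authors' published set.

```python
import numpy as np

# Hypothetical mirrored mouth-corner landmark pairs in dlib's 68-point
# scheme (left index, right index); the paper's actual pairs are not given.
MOUTH_PAIRS = [(48, 54), (49, 53), (50, 52)]

def asymmetry_coefficient(rest, test, pairs):
    """Relative left/right displacement imbalance for mirrored landmark pairs.

    rest, test: (68, 2) arrays of landmark coordinates at rest and during
    a mimic test. Returns 0.0 for perfectly symmetric motion and approaches
    1.0 when one side of the face barely moves.
    """
    rest = np.asarray(rest, dtype=float)
    test = np.asarray(test, dtype=float)
    left_idx = [l for l, _ in pairs]
    right_idx = [r for _, r in pairs]
    # Total Euclidean displacement of each side's landmarks.
    d_left = np.linalg.norm(test[left_idx] - rest[left_idx], axis=1).sum()
    d_right = np.linalg.norm(test[right_idx] - rest[right_idx], axis=1).sum()
    denom = d_left + d_right
    if denom == 0:
        return 0.0  # no motion at all: treat as symmetric
    return abs(d_left - d_right) / denom
```

In practice the landmark arrays would come from dlib's `shape_predictor` applied to video frames of the patient; here they are left as plain NumPy inputs so the coefficient itself is self-contained.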
 RESULTS: A study was conducted to determine statistically significant differences between asymmetry coefficients in the control group and the patients. Based on the Mann-Whitney U test, asymmetry parameters during some mimic tests showed statistically significant differences (p < 0.05). Thus, significant asymmetries were revealed in the forehead when raising the eyebrows (p = 0.00), in the mouth when smiling (p = 0.026), in the mouth when smiling with effort (p = 0.00), in the mouth when pouting the lips (p = 0.039), and in the mouth when articulating with effort (p = 0.004).
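The group comparison above can be sketched with a plain-Python Mann-Whitney U statistic (in real analyses `scipy.stats.mannwhitneyu` would normally be used, since it also reports the p-value); the sample values in the usage below are synthetic, not the study's data.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for two independent samples.

    Ties receive average ranks; returns the smaller of the two U values,
    which is the statistic compared against critical-value tables.
    """
    combined = sorted((v, i) for i, v in enumerate(list(x) + list(y)))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        # Find the run of tied values and assign each the average rank.
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg_rank
        i = j + 1
    r_x = sum(ranks[:len(x)])               # rank sum of the first sample
    u_x = r_x - len(x) * (len(x) + 1) / 2   # U for the first sample
    u_y = len(x) * len(y) - u_x             # U for the second sample
    return min(u_x, u_y)

# Synthetic example: completely separated groups give U = 0,
# the strongest possible evidence of a group difference.
print(mann_whitney_u([0.1, 0.2, 0.3], [0.7, 0.8, 0.9]))
```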
 CONCLUSIONS: The results demonstrate the feasibility of the proposed method and show the need for additional research, in particular a search for differences between groups of patients with different severity grades and the development of a machine learning classification model.