Abstract

A human-computer interaction nursing system is developed for severely paralyzed patients who cannot move their limbs or body and have language barriers, but who are fully conscious and can make head and face movements. If the system can obtain the patient's real intention, it can give the patient some degree of self-care ability through a rehabilitation mechanical device. There are two ways to obtain the patient's real intention: active and passive. In the active way, the conscious patient sends demands to the system through head and face movements; in the passive way, the system infers the patient's real needs from the patient's emotional state. The key problem to be solved is accurately identifying the patient's facial expression and head and face movements. To address this, the system recognizes head movement from the offset direction of the face's center position; face movement is recognized using the Uniform Local Binary Pattern texture feature map; and for facial expression recognition, the system trains an expression recognition model with convolutional neural networks and then uses that model to identify the patient's facial expression. The experimental results indicate that the system can detect the face, segment the eye and mouth regions, and accurately identify the facial expression and the head and face movements.
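The Uniform Local Binary Pattern descriptor mentioned above can be sketched as follows. This is a minimal, illustrative implementation of the standard 8-neighbor, radius-1 uniform LBP histogram (59 bins), not the authors' actual code; the function name and use of NumPy are assumptions.

```python
import numpy as np

def uniform_lbp_histogram(gray):
    """Uniform LBP (8 neighbors, radius 1) histogram for a grayscale image.

    Circular bit patterns with at most two 0/1 transitions are 'uniform'
    (58 of them); all remaining patterns share one 'non-uniform' bin,
    giving the standard 59-bin texture descriptor.
    """
    g = np.asarray(gray, dtype=np.int32)
    h, w = g.shape
    center = g[1:-1, 1:-1]
    # 8 neighbors taken in circular order around the center pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    bits = [(g[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx] >= center).astype(np.int32)
            for dy, dx in offsets]
    code = sum(b << i for i, b in enumerate(bits))  # 8-bit LBP code per pixel

    # Map each of the 256 codes to a histogram bin: one bin per uniform
    # pattern, and a single shared bin for all non-uniform patterns.
    def transitions(p):
        s = [(p >> i) & 1 for i in range(8)]
        return sum(s[i] != s[(i + 1) % 8] for i in range(8))

    uniform_codes = [p for p in range(256) if transitions(p) <= 2]  # 58 codes
    lut = np.full(256, len(uniform_codes), dtype=np.int32)  # non-uniform bin
    for bin_id, p in enumerate(uniform_codes):
        lut[p] = bin_id

    hist = np.bincount(lut[code].ravel(), minlength=len(uniform_codes) + 1)
    return hist / hist.sum()
```

In the described system, such a histogram would be computed over the segmented eye and mouth regions and compared between frames to detect face movement.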
