Abstract
Background and Objectives: To improve assistive technologies for people with reduced mobility, this paper develops a new intelligent real-time emotion detection system for controlling equipment such as electric wheelchairs (EWC) or robotic assistance vehicles. Every year, degenerative diseases and traumas prevent thousands of people from easily controlling the joystick of their wheelchairs with their hands. Most current technologies are considered invasive and uncomfortable, such as those requiring the user to wear body-mounted sensors to control the wheelchair. Methods: In this work, the proposed Human Machine Interface (HMI) provides an efficient hands-free option that requires no sensors or objects attached to the user's body. It allows users to drive the wheelchair using their facial expressions, which can be flexibly updated. This intelligent solution is based on a combination of neural networks (NN) and specific image preprocessing steps. First, the Viola-Jones algorithm is used to detect the user's face in a video stream. Subsequently, a neural network is used to classify the emotions displayed on the face. This solution, called "The Mathematics Behind Emotion", is capable of classifying many facial expressions in real time, such as smiles and raised eyebrows, which are translated into signals for wheelchair control. On the hardware side, this solution only requires a smartphone and a Raspberry Pi board that can be easily mounted on the wheelchair. Results: Many experiments have been conducted to evaluate the efficiency of the control acquisition process and the user experience of driving a wheelchair through facial expressions. The system achieves a classification accuracy of 98.6% and an average recall of 97.1%. These experiments have thus shown that the proposed system is capable of accurately recognizing user commands in real time.
Indeed, the obtained results indicate that the proposed system is more comfortable and better adapted to severely disabled people in their daily lives than conventional methods. Among its advantages is its ability to identify facial expressions in real time from different angles. Conclusions: The proposed system takes the patient's pathology into account. It is intuitive and modern, requires no physical effort, and can be integrated into a smartphone or tablet. The results obtained highlight the efficiency and reliability of this system, which ensures safe navigation for the disabled patient.
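The pipeline described above ends by translating a classified facial expression into a wheelchair control signal. The sketch below illustrates one way this last stage could look; the expression labels, command names, and confidence threshold are illustrative assumptions, not the authors' actual control protocol. Falling back to a hold state on low-confidence predictions is likewise a safety assumption added here, not a detail taken from the paper.

```python
# Hypothetical expression-to-command mapping for the final control stage.
# Labels and commands are illustrative assumptions, not the paper's protocol.
EXPRESSION_TO_COMMAND = {
    "smile": "FORWARD",
    "raised_eyebrows": "STOP",
    "left_wink": "TURN_LEFT",
    "right_wink": "TURN_RIGHT",
    "neutral": "HOLD",
}

def expression_to_command(label: str, confidence: float,
                          threshold: float = 0.9) -> str:
    """Map a classifier output (label + confidence) to a drive command.

    Low-confidence predictions fall back to HOLD so the chair never
    moves on an uncertain classification (an added safety assumption).
    Unknown labels also map to HOLD.
    """
    if confidence < threshold:
        return "HOLD"
    return EXPRESSION_TO_COMMAND.get(label, "HOLD")
```

For example, a smile detected with high confidence would drive the chair forward, while the same smile at low confidence would leave it stationary: `expression_to_command("smile", 0.97)` yields `"FORWARD"`, whereas `expression_to_command("smile", 0.5)` yields `"HOLD"`.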