Abstract

Advances in technology have enabled better and more intuitive Human-Computer Interaction (HCI). However, this technological leap has not bridged the gap for individuals with disabilities, particularly people with different impairments. Work has been done on translating various sign languages and on lip reading, but little work addresses head gesture recognition and head-gesture-to-speech conversion. Paralyzed patients who can move only their head and eyes find it difficult to communicate and interact with machines. This work presents a novel and efficient technique to map restricted head movements to text using the face and emotion detection capabilities of the Intel RealSense SDK via Morse code mapping. The work maps the UP and DOWN positions of the head to the DASH and DOT symbols of Morse code on an EYE_BLINK event trigger, and uses the SMILE emotion to convert the accumulated sequence of Morse symbols into an English character. The resulting language, called HeadSpeak, effectively reduces the effort of generating head gestures through 3D face-position-based gesture detection. The system produces an average of 19 characters per minute and 4.5 words per minute, with a gesture error rate of 0.03%.
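The gesture-to-text mapping described above can be sketched as a small decoder. This is a minimal illustration, not the authors' implementation: the event names (`on_blink`, `on_smile`), the head-position labels, and the omission of the actual Intel RealSense SDK integration are all assumptions made for clarity.

```python
# Hypothetical sketch of the HeadSpeak mapping: on an EYE_BLINK event, the
# UP/DOWN head pose contributes a DASH/DOT; a SMILE event flushes the
# accumulated Morse sequence into an English character.

MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

class HeadSpeakDecoder:
    def __init__(self):
        self.sequence = ""   # Morse symbols accumulated so far
        self.text = ""       # decoded output string

    def on_blink(self, head_position):
        """On an eye-blink trigger, record the current head pose as a symbol."""
        if head_position == "UP":
            self.sequence += "-"   # UP maps to DASH
        elif head_position == "DOWN":
            self.sequence += "."   # DOWN maps to DOT

    def on_smile(self):
        """On a smile trigger, convert the pending sequence to a character."""
        if self.sequence:
            self.text += MORSE_TO_CHAR.get(self.sequence, "?")
            self.sequence = ""

# Spell "HI": DOWN-pose blinks give dots; a smile flushes each letter.
decoder = HeadSpeakDecoder()
for _ in range(4):
    decoder.on_blink("DOWN")   # "...." -> H
decoder.on_smile()
for _ in range(2):
    decoder.on_blink("DOWN")   # ".." -> I
decoder.on_smile()
print(decoder.text)  # HI
```

In a real pipeline, the blink and smile events would come from the RealSense SDK's face and expression tracking, and the head pose would be derived from the tracked 3D face position.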
