Abstract

Touchscreens are essential components of many electronic devices in the daily lives of sighted people in the digital information era. Visually impaired users, by contrast, rely on tactile displays as one of their key devices for interacting with the digital world. However, due to their working mechanism and uneven surface, tactile displays make one of the key features of screens for sighted users surprisingly challenging to implement: precision touch input. To overcome this, a hand gesture recognition system is developed using a frequency-modulated continuous-wave (FMCW) millimeter-wave radar. A multifeature encoder method extracts range and velocity information from the radar and translates the data into spectrogram images. Gesture recognition is implemented for common input gestures: single/double-click, swipe-right/left, scroll-up/down, zoom-in/out, and rotate-anticlockwise/clockwise. Gesture recognition and classification are based on machine learning (support vector machine) and deep learning (convolutional neural network) approaches. The chosen You-Only-Look-Once (YOLOv8) model achieves an accuracy of 97.1% after only 30 training epochs with just 500 collected data samples per gesture. This research paves the way toward using radar sensors not only for tactile displays but also for other digital devices in human–computer interaction.
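The abstract does not detail the multifeature encoder, but a standard way to turn FMCW radar returns into range-velocity spectrograms is a 2D FFT: one transform along fast time (within a chirp) for range, one along slow time (across chirps) for Doppler velocity. The sketch below is a minimal illustration of that generic technique, not the paper's exact pipeline; the frame shape and Hann windowing are assumptions.

```python
import numpy as np

def range_doppler_map(iq_frame: np.ndarray) -> np.ndarray:
    """Compute a log-magnitude range-Doppler map from one FMCW radar frame.

    iq_frame: complex IQ samples shaped (num_chirps, samples_per_chirp).
    """
    num_chirps, num_samples = iq_frame.shape

    # Window each chirp to suppress range sidelobes, then FFT along
    # fast time: peak bins correspond to target range.
    range_fft = np.fft.fft(iq_frame * np.hanning(num_samples), axis=1)
    range_fft = range_fft[:, : num_samples // 2]  # keep positive ranges only

    # FFT along slow time (across chirps) resolves Doppler (velocity);
    # fftshift centres zero velocity in the map.
    doppler_fft = np.fft.fftshift(
        np.fft.fft(range_fft * np.hanning(num_chirps)[:, None], axis=0),
        axes=0,
    )

    # Log-magnitude in dB, as typically rendered into spectrogram images.
    return 20 * np.log10(np.abs(doppler_fft) + 1e-12)


# Example with synthetic data: 64 chirps of 128 samples each (assumed sizes).
rng = np.random.default_rng(0)
frame = rng.standard_normal((64, 128)) + 1j * rng.standard_normal((64, 128))
print(range_doppler_map(frame).shape)  # (64, 64)
```

Stacking such maps over consecutive frames yields the time-evolving spectrogram images that a gesture classifier consumes.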

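For the classification stage, training YOLOv8 on the resulting spectrogram images could look like the following sketch using the Ultralytics API. The 30 epochs match the abstract's training budget, while the model variant, dataset path, and image size are illustrative assumptions.

```python
from ultralytics import YOLO

# Start from a pretrained YOLOv8 classification checkpoint; the "n"
# (nano) variant is an assumption, the abstract does not name one.
model = YOLO("yolov8n-cls.pt")

# Hypothetical dataset layout: gesture_spectrograms/{train,val}/<class>/*.png,
# one folder per gesture (single-click, double-click, swipe-left, ...).
results = model.train(
    data="gesture_spectrograms",  # hypothetical dataset path
    epochs=30,                    # matches the abstract's training budget
    imgsz=224,                    # assumed input resolution
)

# Evaluate top-1 accuracy on the validation split.
metrics = model.val()
print(metrics.top1)
```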