Abstract

With the popularity of personal computing devices, people often keep their heads immobile for long periods in front of screens, giving rise to "phubbers" and sedentary office workers. Early-warning solutions in the Internet of Healthcare Things (IoHT) offer hope for protecting users' health and safety. However, most existing works cannot recognize different head gestures during walking, which is also a common cause of text neck and traffic accidents. In addition, they require a large amount of data to update the model to adapt to a new environment, which reduces their practicality. To solve these problems, we propose CSEar, a system based on the built-in accelerometers of off-the-shelf wireless earphones that can recognize 12 kinds of head gestures in both resting and walking states. First, an innovative algorithm is designed to detect head-gesture signals, especially signals mixed with gait. Then, we propose MetaSensing, a head gesture recognition model that improves recognition ability with few samples compared with existing meta-learning algorithms. Finally, experimental results demonstrate the effectiveness and robustness of CSEar.
