Abstract

This work proposes a low-power, high-accuracy embedded hand-gesture recognition algorithm for battery-operated wearable devices equipped with low-power short-range radar sensors. A 2-D convolutional neural network (CNN) operating on range-frequency Doppler features is combined with a temporal convolutional network (TCN) for time-sequence prediction. The final model has only 46 thousand parameters, yielding a memory footprint of just 92 KB. Two data sets, containing a total of 20,210 instances of 11 challenging hand gestures performed by 26 different people, have been recorded. On the 11-gesture data set, accuracies of 86.6% (26 users) and 92.4% (single user) have been achieved, comparable to the state of the art, which reaches 87% (10 users) and 94% (single user), while using a TCN-based network that is 7500× smaller. Furthermore, the gesture recognition classifier has been implemented on a parallel ultra-low-power processor, demonstrating that real-time prediction is feasible at only 21 mW of power consumption for the full TCN sequence-prediction network, with a system-level power consumption below 120 mW. We provide open-source access to example code and all data collected and used in this work at tinyradar.ethz.ch.
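The abstract's numbers can be illustrated with a short sketch. The core building block of a TCN is the causal dilated 1-D convolution, where the output at time t depends only on inputs at times ≤ t; the version below is a minimal pure-Python illustration, not the paper's implementation. The footprint arithmetic assumes 16-bit weights, which is an inference from the stated 46 thousand parameters and 92 KB footprint, not a detail given in the abstract.

```python
def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution: left-pad with zeros so the
    output at time t uses only inputs at times <= t (the TCN property)."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = [0.0] * pad + list(x)  # zero left-padding preserves causality
    return [
        sum(w[i] * xp[t + pad - i * dilation] for i in range(k))
        for t in range(len(x))
    ]


# Footprint check: 46k parameters at 2 bytes per weight
# (assumed 16-bit quantization) matches the reported 92 KB.
PARAMS = 46_000
BYTES_PER_PARAM = 2  # assumption: 16-bit weights
print(PARAMS * BYTES_PER_PARAM / 1000)  # -> 92.0 (KB)

# Small demo: kernel [1, 0] reproduces the input; kernel [0, 1]
# with dilation 2 delays the signal by two steps.
x = [1.0, 2.0, 3.0, 4.0]
print(causal_dilated_conv1d(x, [1.0, 0.0]))              # -> [1.0, 2.0, 3.0, 4.0]
print(causal_dilated_conv1d(x, [0.0, 1.0], dilation=2))  # -> [0.0, 0.0, 1.0, 2.0]
```

Stacking such layers with exponentially growing dilations (1, 2, 4, …) is what lets a TCN cover a long gesture window with very few parameters, consistent with the small model size reported above.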
