Abstract

Sensor-based human activity recognition (HAR) can provide users with more convenience and security than machine-vision-based methods. An ideal HAR application would achieve high accuracy and low latency with minimal hardware cost. However, the speed-accuracy tradeoff in mobile-device-based HAR has rarely been systematically analyzed and discussed. In contrast to the static networks of existing deep HAR models, this article designs a resource-efficient dynamic network, called RepHAR, for low-cost, hardware-constrained HAR tasks. This article is the first to combine multibranch (MB) topologies with structural reparameterization techniques for HAR. Specifically, RepHAR uses structural reparameterization to decouple the training-time architecture from the inference-time architecture: the model has an MB topology during training but is converted into a plain convolutional neural network (CNN) for inference. As a result, RepHAR achieves a tradeoff between speed and accuracy, obtaining both the accuracy gain of an MB network and the fast inference of a plain CNN. On four publicly available datasets (UCI-HAR, PAMAP2, UNIMIB-SHAR, and OPPORTUNITY), RepHAR achieves a 0.83%–2.18% accuracy improvement and a 12%–44% parameter reduction, which demonstrates the effectiveness of the proposed method. Finally, on an embedded Raspberry Pi, RepHAR runs 72% faster than the MB CNN model, demonstrating its usefulness and practicality. Our model and code will be released soon.
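As a minimal sketch of the reparameterization idea described above (not the authors' released code), the following PyTorch example shows how a training-time block with parallel 3x3 and 1x1 convolution branches can be fused into a single plain 3x3 convolution for inference. The class and function names (`TrainBlock`, `fuse_branches`) are illustrative assumptions, and batch normalization is omitted for brevity.

```python
# Minimal sketch of structural reparameterization (branch fusion), assuming a
# training-time block of the form y = conv3x3(x) + conv1x1(x). Names are
# illustrative and do not come from the paper's codebase.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TrainBlock(nn.Module):
    """Multibranch block used at training time: y = conv3x3(x) + conv1x1(x)."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.conv3 = nn.Conv2d(c_in, c_out, 3, padding=1, bias=True)
        self.conv1 = nn.Conv2d(c_in, c_out, 1, padding=0, bias=True)

    def forward(self, x):
        return self.conv3(x) + self.conv1(x)

def fuse_branches(block: TrainBlock) -> nn.Conv2d:
    """Merge the two branches into one plain 3x3 conv for inference.
    Convolution is linear, so padding the 1x1 kernel to 3x3 (center tap)
    and summing weights and biases gives an exactly equivalent operator."""
    fused = nn.Conv2d(block.conv3.in_channels, block.conv3.out_channels,
                      3, padding=1, bias=True)
    w1_padded = F.pad(block.conv1.weight, [1, 1, 1, 1])  # 1x1 -> 3x3
    fused.weight.data = block.conv3.weight.data + w1_padded
    fused.bias.data = block.conv3.bias.data + block.conv1.bias.data
    return fused

# Sanity check: the fused plain conv reproduces the multibranch output.
if __name__ == "__main__":
    x = torch.randn(1, 8, 32, 32)
    block = TrainBlock(8, 16).eval()
    fused = fuse_branches(block).eval()
    assert torch.allclose(block(x), fused(x), atol=1e-5)
```

The fused operator is mathematically identical to the training-time block, so the accuracy gained from the MB topology is retained while inference runs a single convolution per layer.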

