Abstract

Classification of human activities from wearable sensor data is challenged by inter-subject variance and resource-constrained platforms. We address these issues with SincEMG, a deep neural network that exploits digital signal processing concepts and transfer learning to reduce model size for activity recognition on raw sensor data. The model’s first layer decomposes signals into frequency bands using finite impulse response filters optimized directly from the data. The subsequent convolutional layers downsample across time and aggregate the first layer’s band outputs. Batch normalization and dropout regularize the intermediate layer outputs. This approach reduces compute requirements by decreasing the number of learned parameters and eliminating the need for significant data pre-processing. In addition, the model’s first layer learns a set of bandpass filters, which provide insight into the predictive regions of the source spectrum. We evaluate SincEMG on two publicly available surface electromyography datasets. Our model uses far fewer parameters than competing approaches and achieves state-of-the-art results, with 98.53% accuracy on the 7-class task and 68.45% accuracy on the 18-class task.
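
To make the first-layer idea concrete, the following is a minimal PyTorch sketch of a convolutional layer whose FIR band-pass filters are parameterized by learnable cut-off frequencies, in the spirit of SincNet-style sinc convolutions. The class name, filter count, kernel size, and sample rate below are illustrative assumptions, not the paper's exact implementation.

import math
import torch
import torch.nn as nn


class SincFilterLayer(nn.Module):
    """First layer sketch: learnable FIR band-pass filters (illustrative, not the paper's code)."""

    def __init__(self, num_filters=16, kernel_size=65, sample_rate=200.0):
        super().__init__()
        self.kernel_size = kernel_size
        self.sample_rate = sample_rate
        # Learnable low cut-off and bandwidth (Hz) for each band-pass filter.
        low = torch.linspace(1.0, sample_rate / 2 - 10.0, num_filters)
        band = torch.full((num_filters,), 5.0)
        self.low_hz = nn.Parameter(low.unsqueeze(1))
        self.band_hz = nn.Parameter(band.unsqueeze(1))
        # Fixed time axis (left half of the symmetric kernel) and Hamming window.
        n = (kernel_size - 1) / 2.0
        self.register_buffer(
            "n_", 2 * math.pi * torch.arange(-n, 0).unsqueeze(0) / sample_rate
        )
        self.register_buffer(
            "window_",
            0.54 - 0.46 * torch.cos(
                2 * math.pi * torch.arange(kernel_size // 2) / kernel_size
            ),
        )

    def forward(self, x):
        # x: (batch, 1, time) raw signal window.
        low = torch.abs(self.low_hz)
        high = torch.clamp(low + torch.abs(self.band_hz), max=self.sample_rate / 2)
        # Band-pass impulse response = difference of two windowed sinc low-pass filters.
        f_low = torch.matmul(low, self.n_)
        f_high = torch.matmul(high, self.n_)
        band_left = (torch.sin(f_high) - torch.sin(f_low)) / (self.n_ / 2) * self.window_
        band_center = 2 * (high - low)  # filter value at n = 0
        filters = torch.cat([band_left, band_center, band_left.flip(dims=[1])], dim=1)
        filters = filters.view(-1, 1, self.kernel_size)
        return nn.functional.conv1d(x, filters, padding=self.kernel_size // 2)


# Usage: decompose a raw single-channel window into learned frequency bands,
# which later convolutional layers would downsample and aggregate.
layer = SincFilterLayer(num_filters=16, kernel_size=65, sample_rate=200.0)
x = torch.randn(8, 1, 400)   # batch of raw signal windows
bands = layer(x)             # (8, 16, 400) band-limited outputs

Because only two scalars (low cut-off and bandwidth) are learned per filter rather than every tap, such a layer carries far fewer parameters than an ordinary first convolution of the same kernel size, and the learned cut-offs can be read off directly to see which spectral regions the model relies on.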


Introduction

Human activity recognition (HAR) enables rich user experiences for many applications, with the potential to improve the livelihood of persons with disabilities [1], [2]. HAR approaches rely on mobile or internet-of-things (IoT) [3] devices equipped with sensors that continuously monitor the subject. To expand practical use-cases, recognition using these data is performed on-device, which minimizes response time and eliminates reliance on external systems [4]. Activity recognition models are therefore driven to reduce complexity while maintaining acceptable recognition performance. In pursuit of improved performance, popular deep learning [5] architectures have been applied to sensor data for activity recognition [6]. Architectures such as convolutional neural networks (CNNs) [7]–[9], recurrent neural networks (RNNs) [10], or a combination of the two [11]–[13] have been used to classify activities from sensor data.

