Abstract

Recognizing human activity from highly sparse body sensor data is becoming an important problem in the Internet of Medical Things (IoMT) industry. Because IoMT currently uses batteryless, passive wearable body sensors for activity recognition, the data signals of these sensors are highly sparse: the time intervals between sensor readings are irregular, and the number of readings per unit time is often limited. Learning activity recognition models from such temporally sparse data is therefore challenging. Traditional machine learning techniques are not applicable in this scenario because they focus on a single sensing modality and require a regular sampling rate. In this work, we propose an effective end-to-end deep neural network (DNN) model that recognizes human activities from the temporally sparse data signals of passive wearable sensors and improves activity recognition accuracy. A dropout technique is used in the developed model to deal with the sparsity problem and to avoid overfitting. In addition, the proposed DNN model was optimized by evaluating different numbers of hidden layers. Various experiments were conducted on a public clinical room dataset of sparse data signals to compare the performance of the proposed DNN model with conventional and other deep learning approaches. The experimental results demonstrate that the proposed DNN model outperforms existing state-of-the-art methods, achieving lower inference delay and higher activity recognition accuracy.
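The abstract does not specify the exact architecture, so the sketch below is only a minimal illustration of the general idea it describes: a fully connected DNN classifier with a dropout layer after each hidden layer, where the depth can be varied as a tuning parameter. All concrete values (input dimension, layer width, dropout rate, number of activity classes) are assumptions, not the authors' configuration.

```python
# Minimal sketch (assumed values, not the authors' exact model): a fully
# connected DNN with dropout for activity classification from sparse
# wearable-sensor features.
import tensorflow as tf

NUM_FEATURES = 12   # assumed: features extracted per window of sensor readings
NUM_CLASSES = 5     # assumed: number of activity labels
HIDDEN_LAYERS = 3   # depth is a tunable hyperparameter, as in the paper
DROPOUT_RATE = 0.3  # assumed dropout rate

model = tf.keras.Sequential([tf.keras.layers.Input(shape=(NUM_FEATURES,))])
for _ in range(HIDDEN_LAYERS):
    model.add(tf.keras.layers.Dense(64, activation="relu"))
    model.add(tf.keras.layers.Dropout(DROPOUT_RATE))  # regularizes against overfitting on sparse data
model.add(tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"))

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

In this kind of setup, the number of hidden layers would be selected by training several such models of different depths and comparing their validation accuracy, which mirrors the layer-count evaluation mentioned in the abstract.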
