Abstract
Human activity recognition (HAR) has been an active research area for more than a decade, yet several crucial aspects remain challenging. Providing detailed and reliable information about people's activities and behaviors is one of the most important tasks in ubiquitous computing, with applications ranging from healthcare and security to entertainment. Human activity recognition can be carried out using smartphone sensors such as accelerometers and gyroscopes, or images captured by webcams. Today, deep neural networks have received much attention in this domain and have produced more accurate and effective results than traditional techniques, performing their computations across multiple hidden layers. In this article, a new approach called HAR-CT is proposed to enhance the accuracy of human activity recognition across various classes by adopting a convolutional neural network (CNN). Subsequently, an optimization technique based on the ternary weight network (TWN) model is proposed to reduce the complexity of the deep neural network, thereby decreasing the energy consumption of mobile devices. To this end, the floating-point weights of the convolutional neural network are quantized into ternary weights, with only a small loss of accuracy relative to the original network. The evaluation results for both networks demonstrate that the proposed methods outperform recently published approaches to human activity recognition.
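As an illustration of the quantization step described above, the following is a minimal sketch of threshold-based ternary weight quantization in the spirit of the TWN model; the function name, threshold factor, and tensor shapes are illustrative assumptions and are not taken from the paper itself.

```python
import numpy as np

def ternarize(weights: np.ndarray, delta_factor: float = 0.7):
    """Quantize float weights to {-W_l, 0, +W_l} for one layer.

    delta_factor * mean(|w|) is the commonly used TWN-style threshold;
    W_l is the mean magnitude of the weights exceeding that threshold.
    """
    delta = delta_factor * np.mean(np.abs(weights))   # per-layer threshold
    mask = np.abs(weights) > delta                    # weights kept as +/- W_l
    w_l = np.mean(np.abs(weights[mask])) if mask.any() else 0.0
    ternary = np.zeros_like(weights)
    ternary[weights > delta] = w_l
    ternary[weights < -delta] = -w_l
    return ternary, w_l

# Example: quantize one convolutional kernel's float32 weights
kernel = np.random.randn(3, 3, 32, 64).astype(np.float32)
t_kernel, scale = ternarize(kernel)
print(np.unique(t_kernel))  # three values: -scale, 0.0, +scale
```

Restricting each layer to three weight values allows the multiplications in a convolution to be replaced by additions, subtractions, and a single scaling, which is the source of the energy savings claimed for mobile devices.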