Abstract

To achieve better results in human complex activity recognition, the optimal feature representation of human complex activities from sensor data is studied. Feature sets obtained from learned networks, from manual extraction, and from location information are fused to generate mixed features. A method for extracting multi-layer features from a hybrid CNN and BLSTM network is proposed: the output feature vector of the second CNN layer and that of the BLSTM are combined to form the multi-layer features. In addition, a new feature selection method based on sequential forward selection (SFS) and network weight analysis is proposed, which selects a set of dominant features from the features manually extracted from the segmented sensor data. The location information of complex activities, after one-hot encoding, is also used as a feature and is fused with the multi-layer features and the manual dominant features to generate the mixed features. To further improve recognition performance, the sensor data collected at different body positions are separated and used independently to train the hybrid CNN and BLSTM network, so that each position contributes its own state information. The PAMAP2 and UT-Data datasets are used in our experiments to verify the proposed method. The results show that the proposed multi-feature method substantially outperforms existing state-of-the-art methods for human complex activity recognition from sensor data.
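The following is a minimal sketch, not the authors' implementation, of the multi-layer and mixed-feature idea described above, written in PyTorch. All layer sizes, window lengths, the class name `HybridCnnBlstm`, and the numbers of manual features, body locations, and classes are illustrative assumptions; the abstract only specifies that the second CNN layer's output is combined with the BLSTM output and then fused with manual dominant features and one-hot location information.

```python
# Hedged sketch of a hybrid CNN + BLSTM multi-layer feature extractor.
# Sizes and names below are assumptions for illustration, not the paper's values.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridCnnBlstm(nn.Module):
    """Hypothetical hybrid CNN + BLSTM network: the second convolutional
    layer's feature map and the BLSTM output are pooled and concatenated
    to form the multi-layer feature, then fused with manual dominant
    features and one-hot location information into a mixed feature."""

    def __init__(self, n_channels: int = 9, n_locations: int = 3,
                 n_manual: int = 20, n_classes: int = 12):
        super().__init__()
        self.conv1 = nn.Conv1d(n_channels, 64, kernel_size=5, padding=2)
        self.conv2 = nn.Conv1d(64, 128, kernel_size=5, padding=2)
        self.blstm = nn.LSTM(input_size=128, hidden_size=64,
                             batch_first=True, bidirectional=True)
        # mixed feature = multi-layer (128 + 2*64) + manual + one-hot location
        self.classifier = nn.Linear(128 + 128 + n_manual + n_locations,
                                    n_classes)

    def forward(self, x, manual_feats, location_onehot):
        # x: (batch, channels, time) segmented sensor window
        h1 = F.relu(self.conv1(x))
        h2 = F.relu(self.conv2(h1))              # second CNN layer output
        cnn_feat = h2.mean(dim=2)                # global average pool -> (B, 128)
        seq = h2.transpose(1, 2)                 # (B, time, 128) for the BLSTM
        out, _ = self.blstm(seq)
        blstm_feat = out[:, -1, :]               # last time step -> (B, 128)
        multi_layer = torch.cat([cnn_feat, blstm_feat], dim=1)
        mixed = torch.cat([multi_layer, manual_feats, location_onehot], dim=1)
        return self.classifier(mixed)


if __name__ == "__main__":
    model = HybridCnnBlstm()
    window = torch.randn(8, 9, 128)      # 8 windows, 9 sensor channels, 128 samples
    manual = torch.randn(8, 20)          # placeholder manual dominant features
    loc = F.one_hot(torch.randint(0, 3, (8,)), num_classes=3).float()
    print(model(window, manual, loc).shape)  # torch.Size([8, 12])
```

In the per-position training described in the abstract, one such network would be trained on the data of each body position separately; the sketch above shows only a single instance.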
