Abstract

Unconstrained human activity recognition (HAR) with a radar network is considered. A hybrid classifier combining convolutional neural networks (CNNs) and recurrent neural networks (RNNs) for spatio-temporal pattern extraction is proposed. Two-dimensional CNNs (2D-CNNs) are first applied to the radar data to perform spatial feature extraction on the input spectrograms. Subsequently, bidirectional gated recurrent units (GRUs) are used to capture the long- and short-term temporal dependencies in the feature maps generated by the 2D-CNNs. Three neural-network-based data fusion methods are explored and compared to exploit the rich information provided by the different radar nodes. The performance of the proposed classifier is rigorously validated using K-fold cross-validation and the L1PO method. Unlike comparable research, a dataset of continuous human activities, with seamless inter-activity transitions occurring at arbitrary times and unconstrained movement trajectories of the participants, has been collected and used for evaluation. A classification accuracy of about 90.8% is achieved for nine-class HAR by the proposed classifier with the halfway fusion method.
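
To make the described architecture concrete, the sketch below shows one possible way to chain a 2D-CNN front end with a bidirectional GRU for spectrogram classification. It is not the authors' implementation: the framework (PyTorch), layer sizes, input dimensions, and the single-node setup (no radar fusion) are illustrative assumptions.

```python
# Minimal sketch of a hybrid 2D-CNN + bidirectional GRU classifier for radar
# spectrograms (assumed shapes and hyperparameters; not the paper's exact model).
import torch
import torch.nn as nn


class CnnBiGruClassifier(nn.Module):
    def __init__(self, n_classes: int = 9, gru_hidden: int = 64):
        super().__init__()
        # 2D-CNN front end: spatial feature extraction on the spectrogram
        # (input shape: batch x 1 x Doppler bins x time frames).
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Bidirectional GRU over the time axis of the CNN feature maps,
        # capturing long- and short-term temporal dependencies.
        self.gru = nn.GRU(
            input_size=32 * 32,   # channels x reduced Doppler bins (assumes 128 input bins)
            hidden_size=gru_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.head = nn.Linear(2 * gru_hidden, n_classes)

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        feats = self.cnn(spectrogram)                          # (B, C, D', T')
        b, c, d, t = feats.shape
        seq = feats.permute(0, 3, 1, 2).reshape(b, t, c * d)   # time-major sequence
        out, _ = self.gru(seq)                                 # (B, T', 2*hidden)
        return self.head(out[:, -1])                           # class logits from last step


# Example: a batch of spectrograms with 128 Doppler bins and 256 time frames.
if __name__ == "__main__":
    model = CnnBiGruClassifier()
    x = torch.randn(4, 1, 128, 256)
    print(model(x).shape)  # torch.Size([4, 9])
```

In a multi-node radar network, such a single-node branch could be replicated per node and combined at the input, feature, or decision level, which is the kind of design space the three fusion methods in the abstract compare.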
