Abstract
Micro-Doppler signatures obtained from Doppler radar are generally used for human activity classification. However, if the angle between the direction of motion and the radar antenna broadside exceeds 60°, the micro-Doppler signatures generated by the radial motion of the human body are reduced significantly, thereby degrading the performance of the classification algorithm. For accurate classification of different human activities irrespective of trajectory, we propose a new algorithm based on dual micro-motion signatures, namely, micro-Doppler and interferometric micro-motion signatures, using an interferometric radar. First, the motion of different parts of the human body is simulated using motion capture (MOCAP) data, which is then used to generate radar echo signals. Second, time-varying Doppler and interferometric spectrograms, obtained from time-frequency analysis of a single Doppler receiver and the interferometric output data, respectively, are fed as input to a deep convolutional neural network (DCNN) for feature extraction and training/testing. The performance of the proposed algorithm is analyzed and compared with a classifier based on micro-Doppler signatures alone. Results show that the dual micro-motion DCNN classifier using an interferometric radar can classify different human activities with 98% accuracy, even in scenarios where Doppler signatures diminish considerably and provide insufficient information for classification. The proposed dual micro-motion classification algorithm is also verified on a real radar test dataset of different human walking patterns, achieving approximately 90% classification accuracy.
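The time-frequency analysis step described above can be illustrated with a minimal sketch: a complex radar return with a sinusoidal micro-Doppler modulation (a crude stand-in for limb swing superimposed on torso motion) is converted into a time-varying Doppler spectrogram via the short-time Fourier transform. All parameter values here are illustrative assumptions, not values from the paper, and `scipy.signal.stft` is used in place of whatever time-frequency tool the authors employed.

```python
import numpy as np
from scipy.signal import stft

# Illustrative (hypothetical) signal parameters, not taken from the paper
fs = 1000.0                      # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)    # 2 s observation window
f_torso = 100.0                  # bulk (torso) Doppler shift, Hz
f_swing = 2.0                    # limb-swing rate, Hz
B = 60.0                         # micro-Doppler excursion, Hz

# Complex echo: carrier at the torso Doppler, frequency-modulated by limb swing
phase = 2 * np.pi * f_torso * t + (B / f_swing) * np.sin(2 * np.pi * f_swing * t)
x = np.exp(1j * phase)

# Short-time Fourier transform -> time-varying Doppler spectrogram.
# Two-sided output keeps negative Doppler bins (approaching/receding motion).
f, tau, Zxx = stft(x, fs=fs, nperseg=128, noverlap=96, return_onesided=False)
S = np.abs(Zxx)  # magnitude spectrogram; images like this are the DCNN input
```

A matrix such as `S` (frequency bins x time frames) is the kind of spectrogram image that, per the abstract, is fed to the DCNN; the interferometric channel would yield a second, analogous image.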