Abstract

In this paper, we propose a human pose estimation algorithm for impulse radio ultra-wideband (IR-UWB) radar based on a transformer deep learning model. We have built an IR-UWB radar system with an 8-by-8 multiple-input multiple-output (MIMO) antenna array. Our IR-UWB radar system is advantageous for through-wall detection applications since it operates over a lower frequency range (0.45 to 3.55 GHz) than the radars used in existing work on RF-based human pose estimation. Moreover, human pose estimation with IR-UWB radar has received little attention, since prior studies have relied on frequency-modulated continuous wave (FMCW) radars or WiFi devices. We propose the 3D-TransPOSE algorithm for 3D human pose estimation from IR-UWB radar signals. The proposed algorithm is built on the transformer architecture; while transformers have been studied extensively in the natural language processing (NLP) and computer vision domains, no prior work has applied them to RF-based human pose estimation. The attention mechanism of the proposed algorithm focuses on the time segments of the IR-UWB radar signals that are relevant to the human pose, eliminating the need to convert radar signals into a voxelized 3D image. We have collected a large dataset of IR-UWB radar signals labeled with 3D human skeletons and show that the proposed algorithm detects human skeletons with high accuracy.
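To make the approach concrete, the sketch below illustrates the general idea in PyTorch: a transformer encoder attends over time segments of the raw multi-channel IR-UWB returns and directly regresses a 3D skeleton, with no voxelization step. The module name, layer sizes, segment count, and the 17-joint output are illustrative assumptions for this sketch, not the paper's actual 3D-TransPOSE configuration.

```python
# Minimal sketch (not the authors' implementation): a transformer encoder that
# attends over the fast-time axis of raw IR-UWB frames and regresses 3D joints.
import torch
import torch.nn as nn

class RadarPoseTransformer(nn.Module):
    def __init__(self, n_channels=64, n_time_segments=128, d_model=256,
                 n_heads=8, n_layers=4, n_joints=17):
        super().__init__()
        # Each time segment (across all 8x8 = 64 Tx-Rx channels) becomes one token.
        self.embed = nn.Linear(n_channels, d_model)
        self.pos = nn.Parameter(torch.zeros(1, n_time_segments, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Regress (x, y, z) for every joint from the pooled token representation.
        self.head = nn.Linear(d_model, n_joints * 3)
        self.n_joints = n_joints

    def forward(self, x):
        # x: (batch, n_time_segments, n_channels) -- raw radar signal, no 3D voxel image
        tokens = self.embed(x) + self.pos
        feats = self.encoder(tokens)        # self-attention over time segments
        pooled = feats.mean(dim=1)          # average-pool the token sequence
        return self.head(pooled).view(-1, self.n_joints, 3)

# Example: a batch of 4 frames, 128 fast-time segments, 64 MIMO channels.
model = RadarPoseTransformer()
skeleton = model(torch.randn(4, 128, 64))   # -> (4, 17, 3) joint coordinates
```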
