Abstract

Owing to its low deployment cost, wireless network-based device-free sensing (DFS) has attracted considerable interest and has demonstrated its feasibility in foliage-penetration (FOPEN) target recognition. The classification accuracy of this technology is known to decrease dramatically in extreme climates, where the received signals tend to be severely attenuated; while deep learning approaches have boosted performance, they are effective only when trained on large amounts of labeled data. Consequently, how to ensure reasonable detection accuracy in extreme climates, where sufficient samples are difficult to obtain, remains an open question. To address this concern, we adopt two measures for performance enhancement in this paper. The first is to employ higher-order spectral (HOS) analysis to transform the time-domain signals into bispectrum image representations, so that the shift to an image classification task allows existing Convolutional Neural Network (CNN) models to be reused. More importantly, this transform improves the approach's immunity to unwanted clutter in foliage environments. The second is to present an end-to-end Deep Learning Data Augmentation and Classification (DLDAC) model comprising a Deep Convolutional Generative Adversarial Network (for data augmentation) and a SqueezeNet CNN backbone (for target classification), which improves classifier performance by exploiting the augmented data on the fly. The negative impact of the low-data regime in extreme climates can thus be considerably mitigated. To evaluate the effectiveness of the proposed approach, comprehensive experiments are conducted on a real FOPEN dataset collected by impulse-radio ultra-wideband (IR-UWB) transceivers under three severe weather conditions.
The experimental results demonstrate that even when only 300 training samples are available for each type of target under every weather condition, the average classification accuracy of the proposed approach still exceeds 92% in distinguishing human targets from other targets.
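The bispectrum transform mentioned in the abstract, which maps a 1-D time-domain signal to a 2-D image suitable for CNN input, can be sketched as below. This is a minimal direct (FFT-based) bispectrum estimator in NumPy; the segment length, window, FFT size, and averaging scheme are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def bispectrum_image(x, nfft=64, seg_len=64, overlap=32):
    """Estimate |B(f1, f2)| = |E[X(f1) X(f2) X*(f1+f2)]| by
    averaging over overlapping windowed segments of x."""
    step = seg_len - overlap
    window = np.hanning(seg_len)
    f = np.arange(nfft)
    F1, F2 = np.meshgrid(f, f, indexing="ij")
    B = np.zeros((nfft, nfft), dtype=complex)
    n_segs = 0
    for start in range(0, len(x) - seg_len + 1, step):
        X = np.fft.fft(x[start:start + seg_len] * window, nfft)
        # Triple product accumulated on the (f1, f2) grid;
        # frequencies wrap modulo nfft (FFT periodicity).
        B += X[F1] * X[F2] * np.conj(X[(F1 + F2) % nfft])
        n_segs += 1
    mag = np.abs(B) / n_segs
    # Scale to an 8-bit grayscale image for a CNN classifier.
    return (255 * mag / mag.max()).astype(np.uint8)

# Example: a noisy signal with quadratic phase coupling, which the
# bispectrum is designed to reveal.
t = np.arange(1024)
sig = (np.cos(2 * np.pi * 0.10 * t) + np.cos(2 * np.pi * 0.15 * t)
       + np.cos(2 * np.pi * 0.25 * t)  # 0.25 = 0.10 + 0.15 (coupled)
       + 0.5 * np.random.default_rng(0).standard_normal(1024))
img = bispectrum_image(sig)
print(img.shape)  # (64, 64) grayscale bispectrum image
```

A third-order statistic of this kind suppresses additive Gaussian noise (whose bispectrum is theoretically zero), which is one plausible reason the authors cite improved immunity to foliage clutter.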
