Abstract

In computer vision and image processing, the shift from traditional cameras to emerging sensing modalities for tasks such as gesture recognition and object detection helps address privacy concerns. This study targets the Integrated Sensing and Communication (ISAC) era, using millimeter-wave signals as radar together with a Convolutional Neural Network (CNN) model for event sensing. Our focus is on leveraging deep learning to detect security-critical gestures, converting millimeter-wave parameters into point cloud images, and improving recognition accuracy. Because CNN inference is computationally expensive, we developed flexible quantization methods that simplify You Only Look Once (YOLO)-v4 operations using an 8-bit fixed-point number representation. Cross-simulation validation showed that CPU-based quantization improves speed by 300% with minimal accuracy loss, and even doubles the speed of the YOLO-tiny model in a GPU environment. We built a Raspberry Pi 4-based system that combines the simplified deep learning model with Message Queuing Telemetry Transport (MQTT) Internet of Things (IoT) technology for nursing care. Our quantization method boosted identification speed by nearly 2.9 times, enabling millimeter-wave sensing on embedded systems. Additionally, we implemented hardware-based quantization that quantizes data directly from images or weight files, leading to circuit synthesis and chip design. This work integrates AI with mmWave sensors in the domain of nursing security and hardware implementation to enhance recognition accuracy and computational efficiency. Deploying millimeter-wave radar in medical institutions or homes offers a strong answer to privacy concerns compared with conventional cameras, which capture and analyze the appearance of patients or residents.
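The abstract does not specify the exact quantization scheme, so as an illustration only, the sketch below shows a generic symmetric per-tensor 8-bit quantization of floating-point weights of the kind commonly applied to CNN models such as YOLO; the function names and the per-tensor scale choice are assumptions, not the authors' method.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: map float32 values to int8
    using a single scale derived from the largest magnitude."""
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 values from int8 codes."""
    return q.astype(np.float32) * scale

# Illustrative weight tensor (stand-in for a YOLO convolution kernel).
rng = np.random.default_rng(0)
weights = rng.standard_normal((3, 3)).astype(np.float32)

q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)

# Rounding error of symmetric quantization is bounded by scale / 2.
max_err = float(np.max(np.abs(weights - recovered)))
```

Storing weights as int8 cuts memory traffic by 4x versus float32 and lets inference use integer arithmetic, which is the usual source of the CPU/GPU speedups the abstract reports.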
