Abstract

A smart home that assists people with disabilities requires a non-invasive user identification system that can recognize family members in real time and is easy to use. Existing systems suffer from privacy concerns when cameras are used, the inconvenience of requiring physical contact with sensors, or the need to always carry a specific device. In addition, these systems typically rely on edge nodes to collect data and then transfer it to cloud servers for high-performance inference; this dependency introduces network delays that hinder real-time service and raises security issues. To address these requirements, this paper presents a real-time, non-invasive user identification system that recognizes users as they step on a foot pad. We introduce an edge node that measures foot pressure distribution in real time, together with a preprocessing system for data generalization. We further propose a system that performs real-time user inference using only resource-constrained edge nodes, overcoming the challenges of cloud-based systems, including avoiding their security issues without requiring dedicated protocols. To achieve this, we optimized various deep learning-based user identification models to run on edge nodes and compared their performance. As a result, a ResNet18 model optimized through pruning and post-training integer quantization achieved inference within 1.5 s at 85% accuracy. Compared to the worst-performing model, AlexNet, the ResNet18 model reduces model size by approximately 33%, cuts memory usage by about 80%, and increases inference speed by more than tenfold.
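The abstract describes optimizing ResNet18 through pruning and post-training integer quantization for edge deployment. The sketch below illustrates one way such a pipeline could look; it is a minimal, hypothetical example assuming PyTorch, a 30% sparsity ratio, and randomly generated calibration inputs, none of which are specified in the paper.

```python
# Hypothetical sketch: pruning + post-training integer quantization of ResNet18.
# PyTorch and the chosen settings are assumptions, not the authors' exact setup.
import torch
import torch.nn.utils.prune as prune
from torchvision.models.quantization import resnet18

# Load a quantization-ready ResNet18. Weights would normally come from training
# on the foot-pressure dataset; here the model is randomly initialized.
model = resnet18(weights=None, quantize=False)
model.eval()

# 1) Unstructured L1 pruning of convolution weights
#    (30% sparsity is an assumed value, not taken from the paper).
for module in model.modules():
    if isinstance(module, torch.nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# 2) Post-training static quantization to int8.
model.fuse_model()  # fuse Conv+BN+ReLU blocks before quantization
model.qconfig = torch.quantization.get_default_qconfig("fbgemm")
torch.quantization.prepare(model, inplace=True)

# Calibration pass with representative inputs
# (random tensors stand in for real foot-pressure samples here).
with torch.no_grad():
    for _ in range(10):
        model(torch.randn(1, 3, 224, 224))

torch.quantization.convert(model, inplace=True)  # int8 weights/activations
```

The quantized model can then be exported (e.g., via TorchScript) for deployment on a resource-constrained edge node; the actual deployment path used by the authors is not stated in the abstract.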
