Abstract

Background: Intelligent garments, a burgeoning class of wearable devices, have extensive applications in domains such as sports training and medical rehabilitation. Nonetheless, existing research in the smart-wearables domain predominantly emphasizes sensor functionality and quantity, often overlooking crucial aspects of user experience and interaction.

Methods: To address this gap, this study introduces a novel real-time 3D interactive system based on intelligent garments. The system uses lightweight sensor modules to collect human motion data and introduces a dual-stream fusion network based on pulsed neural units to classify and recognize human movements, thereby achieving real-time interaction between users and sensors. Additionally, the system incorporates 3D human visualization functionality, which renders sensor data and recognized human actions as 3D models in real time, providing accurate and comprehensive visual feedback that helps users better understand and analyze the details and features of human motion. The system has significant potential for applications in motion detection, medical monitoring, virtual reality, and other fields. Accurate classification of human actions contributes to the development of personalized training plans and injury-prevention strategies.

Conclusions: This study has substantial implications for the domains of intelligent garments, human motion monitoring, and digital-twin visualization. The advancement of this system is expected to propel the progress of wearable technology and foster a deeper comprehension of human motion.
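The abstract describes the pulsed neural units only at a high level. As a rough illustration of what such a unit typically looks like, the following is a minimal sketch of a leaky integrate-and-fire (LIF) spiking neuron, the standard building block in spiking networks; all class names and parameter values here are illustrative assumptions, not taken from the paper.

```python
# Minimal leaky integrate-and-fire (LIF) spiking unit.
# Parameters (tau, v_threshold, v_reset) are illustrative, not from the paper.

class LIFNeuron:
    def __init__(self, tau=0.9, v_threshold=1.0, v_reset=0.0):
        self.tau = tau                  # membrane decay factor per time step
        self.v_threshold = v_threshold  # firing threshold
        self.v_reset = v_reset          # potential after a spike
        self.v = v_reset                # current membrane potential

    def step(self, current):
        """Integrate one input current sample; return 1 on a spike, else 0."""
        self.v = self.tau * self.v + current  # leaky integration
        if self.v >= self.v_threshold:
            self.v = self.v_reset             # reset after firing
            return 1
        return 0


def spike_train(neuron, inputs):
    """Run a sequence of input currents through a neuron, collecting spikes."""
    return [neuron.step(i) for i in inputs]
```

A sub-threshold constant input produces periodic spikes as the membrane potential accumulates across steps, which is the binary, event-driven signal a spiking classifier would operate on.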
