The combination of Human-Computer Interaction (HCI) technology with biomimetic vision systems has transformational potential in animation design, particularly when biomechanical principles are incorporated to create immersive and interactive experiences. Traditional animation approaches frequently lack sensitivity to real-time human motion, which can restrict engagement and realism. This study addresses this constraint by developing a framework that uses Virtual Reality (VR) and Augmented Reality (AR) to generate dynamic settings encompassing a variety of human activities, informed by biomechanical analysis. A biomimetic vision system records these motions with wearable sensors, allowing precise monitoring of user activity while accounting for biomechanical factors such as joint angles, force distribution, and movement patterns. The recorded data is preprocessed with Z-score normalization, and features are extracted using Principal Component Analysis (PCA). This study proposes an Egyptian Vulture optimized Adjustable Long Short-Term Memory Network (EVO-ALSTM) technique for motion classification, specifically tailored to recognize the biomechanical characteristics of human movements. Results demonstrate a significant improvement in precision (93%), F1-score (91%), accuracy (95%), and recall (90%) for the motion recognition system, highlighting the effectiveness of biomechanical insights in enhancing animation design. The findings indicate that integrating real-time biomechanical data into the animation process leads to more engaging and realistic user experiences. This study not only advances the field of HCI but also provides a framework for future investigations into sophisticated animation technologies that use biomimetic and biomechanical systems.
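The preprocessing pipeline described above (Z-score normalization followed by PCA feature extraction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sensor data is synthetic, the channel count and number of retained components are assumed for demonstration, and PCA is computed here via a singular value decomposition.

```python
import numpy as np

# Hypothetical wearable-sensor readings: 200 time samples x 12 channels
# (e.g., joint angles and force values; dimensions are illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=2.0, size=(200, 12))

# Z-score normalization: zero mean and unit variance per channel.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD of the normalized (hence mean-centered) data,
# keeping the top k principal components as extracted features.
k = 4  # assumed number of components for this sketch
U, S, Vt = np.linalg.svd(X_norm, full_matrices=False)
X_features = X_norm @ Vt[:k].T  # projected features, shape (200, k)

# Fraction of total variance captured by the retained components.
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
```

The resulting low-dimensional feature matrix `X_features` would then be fed to the downstream motion classifier (EVO-ALSTM in the paper).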