The use of inertial measurement units (IMUs) for assessing fall risk is often limited by subject discomfort and challenges in data interpretation, and research on attitude-estimation features remains scarce. To address these issues, we explored novel features and representation methods in the context of sit-to-stand transitions. This study recorded sit-to-stand transition tests with mobile phone cameras from three groups: community-dwelling elderly adults, elderly adults attending day care centers (DCC), and college students. We employed pose estimation technology to extract key point kinematic features from the video data and trained a random forest classifier with 10-fold cross-validation to mitigate the impact of individual differences. We then trained classifiers on the top 5, 10, and 15 features and calculated the average area under the receiver operating characteristic curve (AUC) for each model to compare feature importance. Our results indicated that elbow key point features, such as (KP08) mean Y, (KP08) RMS Y, (KP09) mean Y, and (KP09) RMS Y, are crucial for distinguishing between subject groups, and statistical tests further confirmed the significance of these features. The application of human pose estimation and key point signals shows promise for clinical postural balance screening: the identified features can be used to develop non-invasive tools for assessing postural instability risk, contributing to fall prevention efforts. This study lays the groundwork for integrating additional measurement modalities into sit-to-stand transition analysis to enhance clinical strategies.
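As a rough illustration of the workflow described above, the sketch below trains a random forest on keypoint kinematic features and reports the mean cross-validated AUC for classifiers built on the top 5, 10, and 15 features. The file name, column names (e.g. KP08_mean_Y), the splitting strategy, and the hyperparameters are assumptions for illustration only, not the authors' actual pipeline.

```python
# Minimal sketch (assumed data layout): one row per subject, kinematic
# features per keypoint, and a "group" label for the three subject groups.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

df = pd.read_csv("sit_to_stand_features.csv")            # hypothetical file
feature_cols = [c for c in df.columns if c != "group"]   # e.g. KP08_mean_Y, KP09_RMS_Y, ...
X, y = df[feature_cols].to_numpy(), df["group"].to_numpy()

# Rank features once with a forest, then compare classifiers built on top-k subsets.
ranker = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
order = np.argsort(ranker.feature_importances_)[::-1]

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for k in (5, 10, 15):
    top_k = [feature_cols[i] for i in order[:k]]
    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    # One-vs-rest ROC AUC handles the three-class setting.
    aucs = cross_val_score(clf, df[top_k].to_numpy(), y, cv=cv, scoring="roc_auc_ovr")
    print(f"top {k:2d} features: mean AUC = {aucs.mean():.3f}")
```

The paper reports mitigating individual differences through its cross-validation scheme; a subject-wise split (e.g. scikit-learn's GroupKFold) may be closer to that intent than the stratified split assumed here.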