Abstract

Automatic segmentation is essential for enhancing human activity recognition, especially given the limitations of publicly available datasets, which often lack diversity in daily activities. This study introduces a novel segmentation method that uses skeleton data for more accurate and efficient analysis of human actions. An autoencoder extracts representative features and reconstructs the dataset, and the discrepancies between the original and reconstructed data are used to establish a segmentation threshold. This approach allows continuous activity recordings to be automatically divided into distinct segments. Rigorous evaluation against ground truth on three publicly available datasets demonstrates the method's effectiveness, achieving an average annotation error of 3.6 and average precision, recall, and F1-score of 90%, 87%, and 88%, respectively. These results illustrate the robustness of the proposed method in identifying change points and segmenting continuous skeleton-based activities compared with two state-of-the-art techniques: one based on deep learning and one based on a classical time-series segmentation algorithm. Additionally, the dynamic thresholding mechanism adapts the segmentation process to different activity dynamics, improving overall segmentation accuracy. This performance highlights the potential of the proposed method to advance human activity recognition by improving the accuracy and efficiency of identifying and categorizing human movements.
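
The abstract describes a pipeline in which an autoencoder is trained on the skeleton stream, each frame is scored by its reconstruction error, and change points are flagged where the error crosses a dynamically computed threshold. The Python/PyTorch sketch below is a minimal illustration of that idea, not the authors' implementation: the network sizes, the mean-plus-k-standard-deviations threshold rule, and the segment() helper are all assumptions chosen for demonstration.

    # Minimal sketch (not the paper's code): segmenting a continuous skeleton
    # stream by autoencoder reconstruction error. Shapes and hyperparameters
    # are illustrative assumptions.
    import torch
    import torch.nn as nn

    class SkeletonAutoencoder(nn.Module):
        def __init__(self, n_features: int, latent_dim: int = 16):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(n_features, 64), nn.ReLU(),
                nn.Linear(64, latent_dim),
            )
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 64), nn.ReLU(),
                nn.Linear(64, n_features),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.decoder(self.encoder(x))

    def segment(frames: torch.Tensor, epochs: int = 50, k: float = 2.0):
        """frames: (T, n_features) flattened joint coordinates per frame.
        Returns indices where reconstruction error first exceeds the threshold."""
        model = SkeletonAutoencoder(frames.shape[1])
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):                # fit the autoencoder on the stream
            opt.zero_grad()
            loss = loss_fn(model(frames), frames)
            loss.backward()
            opt.step()
        with torch.no_grad():                  # per-frame reconstruction error
            err = ((model(frames) - frames) ** 2).mean(dim=1)
        threshold = err.mean() + k * err.std() # assumed thresholding rule
        above = err > threshold
        # change points: frames where the error first rises above the threshold
        change = above[1:] & ~above[:-1]
        return torch.nonzero(change).squeeze(1) + 1

    if __name__ == "__main__":
        # toy input: 500 frames of 25 joints x 3 coordinates each
        frames = torch.randn(500, 75)
        print(segment(frames))

Computing the threshold statistics over sliding windows rather than globally would let the rule adapt to local activity dynamics, in the spirit of the dynamic thresholding the abstract describes; the global rule here is kept only for brevity.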
