Recognizing human behavior is essential for early intervention in cognitive rehabilitation, particularly for older adults. Traditional methods often rely on third-person vision and overlook the importance of human visual attention during object interactions. This study introduces an egocentric behavior analysis (EBA) framework that uses transfer learning to analyze object relationships. Egocentric vision is used to extract features from hand movements, object detection, and visual attention. These features are then used to validate hand-object interactions (HOIs) and describe human activities involving multiple objects. The proposed method employs graph attention networks (GATs) with transfer learning, achieving 97% accuracy in categorizing various activities while reducing computation time. These findings suggest that integrating EBA with advanced machine learning methods could revolutionize cognitive rehabilitation by offering more personalized and efficient interventions. Future research can explore real-world applications of this approach, potentially improving the quality of life of older adults through better cognitive health monitoring.
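As a rough illustration only (not the paper's implementation), the sketch below shows how a GAT could classify activities from a hand-object interaction graph whose nodes carry egocentric features (hand pose, detected-object embeddings, visual-attention cues). It uses PyTorch Geometric; the class name, feature dimensions, number of activity classes, and the layer-freezing transfer-learning setup are all assumptions for illustration.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv, global_mean_pool


class HOIActivityGAT(torch.nn.Module):
    """Toy GAT classifier over a hand-object interaction graph.

    Nodes: egocentric features (hands, objects, attention cues).
    Edges: interactions between hands and objects.
    Sizes and class count are placeholders, not values from the paper.
    """

    def __init__(self, in_dim=128, hidden_dim=64, num_classes=10, heads=4):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden_dim, heads=heads)
        self.gat2 = GATConv(hidden_dim * heads, hidden_dim, heads=1)
        self.classifier = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.elu(self.gat1(x, edge_index))   # attention over HOI edges
        x = F.elu(self.gat2(x, edge_index))
        x = global_mean_pool(x, batch)        # graph-level activity embedding
        return self.classifier(x)             # activity logits


# Illustrative transfer-learning setup: reuse GAT layers pre-trained on a
# source activity set and fine-tune only the classification head.
model = HOIActivityGAT()
for layer in (model.gat1, model.gat2):
    for p in layer.parameters():
        p.requires_grad = False
```

In this kind of setup, freezing the pre-trained attention layers and updating only the lightweight classifier head is one plausible way to cut training time on a new activity set, in the spirit of the transfer learning the abstract describes.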