Abstract

With increasing numbers of people living with dementia, there is growing interest in the automatic monitoring of agitation. Current assessments rely on carer observations within a framework of behavioural scales. Automatic monitoring of agitation can supplement existing assessments, providing carers and clinicians with a greater understanding of the causes and extent of agitation. Despite agitation frequently manifesting in repetitive hand movements, the automatic assessment of repetitive hand movements remains a sparsely researched field. Monitoring hand movements is problematic due to the subtle differences between types of hand movements and variations in how they can be carried out; the lack of training data creates additional challenges. This paper proposes a novel approach to assess the type and intensity of repetitive hand movements using skeletal model data derived from video. We introduce a video-based dataset of five repetitive hand movements symptomatic of agitation. Using skeletal keypoint locations extracted from video, we demonstrate a system to recognise repetitive hand movements using discriminative poses. By first learning characteristics of the movement, our system can accurately identify changes in the intensity of repetitive movements. Wide inter-subject variation in agitated behaviours suggests the benefit of personalising the recognition model with some end-user information. Our results suggest that data captured using a single conventional RGB video camera can be used to automatically monitor agitated hand movements of sedentary patients.
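The abstract does not specify the implementation, so the following is only a minimal sketch of the kind of pipeline described: extracting hand keypoints from conventional RGB video and estimating the frequency and amplitude of a repetitive movement as a crude stand-in for intensity. MediaPipe Hands is assumed here as the keypoint extractor (not necessarily the one used in the paper), the FFT-based measure is an illustrative proxy rather than the paper's method, and the video filename is hypothetical.

```python
# Illustrative sketch only; not the paper's implementation.
# Assumes MediaPipe Hands for per-frame keypoint extraction and uses an FFT of
# the wrist trajectory as a rough proxy for the frequency and amplitude
# (intensity) of a repetitive hand movement.
import cv2
import mediapipe as mp
import numpy as np


def wrist_trajectory(video_path, max_frames=300):
    """Extract the normalised x-coordinate of the first detected wrist per frame."""
    hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=2)
    cap = cv2.VideoCapture(video_path)
    xs = []
    while len(xs) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            wrist = result.multi_hand_landmarks[0].landmark[0]  # landmark 0 = wrist
            xs.append(wrist.x)
    cap.release()
    hands.close()
    return np.asarray(xs)


def repetition_stats(signal, fps=30.0):
    """Dominant oscillation frequency (Hz) and amplitude of a keypoint trajectory."""
    centred = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(centred))
    freqs = np.fft.rfftfreq(len(centred), d=1.0 / fps)
    peak = spectrum[1:].argmax() + 1  # skip the DC bin
    return freqs[peak], spectrum[peak] / len(centred)


if __name__ == "__main__":
    traj = wrist_trajectory("agitation_clip.mp4")  # hypothetical example clip
    if len(traj) > 1:
        freq, amp = repetition_stats(traj)
        print(f"dominant frequency: {freq:.2f} Hz, amplitude: {amp:.4f}")
```

A frequency-and-amplitude summary of this kind is only one possible way to quantify intensity; the paper itself recognises movements via discriminative poses and learns movement characteristics before assessing intensity changes.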
